Sunday, March 3, 2024

History and Future of Computers

 

Introduction

            In modern times, the computer has played an essential role in sustaining and advancing society. Computers are extremely versatile tools used by companies, organizations, governments, and individuals on a daily basis. They come in many forms, such as desktops, laptops, tablets, and smartphones. While computers have only been widely used for roughly the past 30 years, and smartphones for the past 15, computers have actually been around far longer than most of us think, and they will continue to be universally used for the foreseeable future.

            It has reached a point where it is almost impossible to operate modern organizations at the same efficiency without the convenience offered by digital documentation and the speed of online communication. Not only that, but digital documentation saves organizations money by reducing the physical paper they use each day. It is for this reason that I do not see a future where we completely abandon computers for something else. On the contrary, technology will continue to advance and make computers ever more efficient at accomplishing daily, weekly, monthly, and yearly tasks.

                This topic relates to Information Technology because knowing the history of computers helps us understand the computers of today and of the future.

History (Analytical Engines)

            While computers started gaining popularity in the 1980s and 1990s, the very first computer actually dates back to the 1800s. As an article published on the University of Minnesota’s website explains (paragraph 4):

“In 1821 Babbage invented the Difference Engine to compile mathematical tables. On completing it in 1832, he conceived the idea of a better machine that could perform not just one mathematical task but any kind of calculation. This was the Analytical Engine (1856), which was intended as a general symbol manipulator, and had some of the characteristics of today’s computers.”

            While the functions of this original computer were minimal, it was the start of what we know today as computers. In 1848, an English mathematician and friend of Babbage named Ada Lovelace wrote the very first computer program, describing how to calculate Bernoulli numbers. Five years later, in 1853, a Swedish inventor named Per Georg Scheutz and his son designed the first computer to compute tabular differences and print the results, known as the printing calculator. And to cap off the 1800s, in 1890, an American statistician named Herman Hollerith designed a punch-card tabulating machine to help calculate the U.S. Census. This system saved the government several years of calculations and saved taxpayers approximately 5 million USD.
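Lovelace’s program described how the Analytical Engine could calculate Bernoulli numbers. As a rough modern illustration of that same calculation (using the standard recurrence in today’s notation, not her actual method), a short sketch might look like this:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n as exact fractions
    (using the convention B_1 = -1/2)."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Recurrence: B_m = -1/(m+1) * sum_{k=0}^{m-1} C(m+1, k) * B_k
        B[m] = -Fraction(1, m + 1) * sum(comb(m + 1, k) * B[k] for k in range(m))
    return B

B = bernoulli(8)  # B_2 = 1/6, B_4 = -1/30; odd-index values past B_1 are zero
```

A calculation a modern laptop finishes instantly would have taken the Analytical Engine many mechanical cycles, which is part of why Lovelace’s notes were so far ahead of their time.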

In the 1930s, a couple of breakthroughs were engineered. In 1931, Vannevar Bush, at the Massachusetts Institute of Technology (MIT), invented the differential analyzer, a mechanical analog computer designed to solve differential equations by integration. It was among the first advanced computing devices to be used operationally. Later, in 1936, British scientist Alan Turing invented the Turing machine, a theoretical device capable of computing anything that is computable; the concept of the modern computer is heavily inspired by his ideas.
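The differential analyzer solved differential equations by mechanically accumulating an integral, step by step. A toy numerical sketch of that same idea (Euler’s method, my own illustration rather than anything Bush actually built) looks like this:

```python
def euler_solve(f, y0, t0, t1, steps):
    """Approximate y(t1) for dy/dt = f(t, y) with y(t0) = y0,
    by accumulating the integral in many small steps."""
    dt = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += dt * f(t, y)  # add one small slice of the integral
        t += dt
    return y

# dy/dt = -y with y(0) = 1 has the exact solution e^(-t),
# so the result at t = 1 should be close to e^(-1) ≈ 0.3679
approx = euler_solve(lambda t, y: -y, 1.0, 0.0, 1.0, 10_000)
```

The machine did with rotating discs and gears what this loop does with arithmetic: trade one hard problem (solving the equation) for many easy ones (adding small slices).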

Many advances like these slowly led to the arrival of what is considered the world’s first modern computer in 1946.

History (Modern Computers)

            February 15, 1946, marks the beginning of the modern computer that lives on to this day: the ENIAC (Electronic Numerical Integrator and Computer). It was the first electronic, programmable, general-purpose digital computer. The ENIAC was funded by the U.S. government during the Second World War and developed by Ursinus Professor of Physics John Mauchly and University of Pennsylvania Professor J. Presper Eckert. The ENIAC is the basis of most computing used today, excluding A.I.

            In the 1960s, the internet started as a method for government researchers to share information. It was invented because computers back then were very large and immobile; to share stored information, magnetic computer tapes had to be sent through the conventional postal system. Computers were not able to communicate directly with each other until January 1983, with the creation of a new communication system called the Transmission Control Protocol/Internet Protocol (TCP/IP), which allowed different kinds of computers to communicate with each other. This marks January 1, 1983 as the official birthday of the internet. On April 30, 1993, the internet became available to the general public when the World Wide Web was released into the public domain.
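TCP/IP is still how programs talk across the internet today. As a minimal illustration of my own (using Python’s standard `socket` module, with both ends running on one machine over the loopback address), one side can echo bytes back to the other over a TCP connection:

```python
import socket
import threading

def echo_once(srv):
    # Accept a single connection and send the received bytes straight back.
    conn, _ = srv.accept()
    conn.sendall(conn.recv(1024))
    conn.close()

def demo():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]
    t = threading.Thread(target=echo_once, args=(srv,))
    t.start()
    cli = socket.create_connection(("127.0.0.1", port))
    cli.sendall(b"hello over TCP/IP")
    data = cli.recv(1024)
    cli.close()
    t.join()
    srv.close()
    return data

reply = demo()
```

The point of TCP/IP is exactly this abstraction: neither end needs to know what kind of computer the other is, only its address and port.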

Current Developments

            Artificial Intelligence (A.I.) is not as new as one might think, although it has different meanings depending on the use. For example, despite video game consoles essentially being computers, video game A.I. and computer A.I. are different things. For decades, video game A.I. has referred to how characters in a game are designed to respond to a programmed situation. A good example is Metal Gear Solid, a stealth video game where enemies are programmed to patrol a perimeter and react to how the player traverses the environment.

This has evolved tremendously over the years, and a clear example is Metal Gear Solid V compared to the first game. In the first game, the enemies just patrolled an area; the instant you were spotted, the entire area was alerted to your presence, and if you hid for a minute or two, everything went back to normal. In the fifth and latest entry, the basics are the same: the enemy patrols an area and reacts to how the player interacts with the environment. The difference is that the A.I. now learns the player’s playing patterns and counters them over time. For example, if the player uses a lot of sleeping gas, the enemies start incorporating gas masks into their ranks; if the player relies on headshots, the enemies start wearing helmets; and so on. Another A.I. advancement is communication between outposts and bases: if a player is spotted and engages in combat, enemies can radio other outposts to request backup and tighten security, which the player can prevent by sabotaging communications.

A.I. improves games in other ways as well. DLSS (Deep Learning Super Sampling) is an A.I. program developed by Nvidia to multiply game performance by generating entirely new frames, upscaling to a higher display resolution, and improving the overall image quality of intense ray-traced content. In essence, it helps your game run at a higher performance level without the need to lower your settings.
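The adaptive behaviour described above can be pictured as a simple tally of the player’s tactics. This toy sketch (my own illustration; the tactic names and threshold are invented, and the game’s real code is nothing like this) equips a countermeasure once a tactic has been observed often enough:

```python
from collections import Counter

class EnemyAI:
    """Toy model: enemies adopt countermeasures to the player's favorite tactics."""

    # Hypothetical mapping from an observed tactic to its counter.
    COUNTERS = {"sleep_gas": "gas_mask", "headshot": "helmet"}
    THRESHOLD = 3  # how many sightings before the enemies adapt

    def __init__(self):
        self.observed = Counter()

    def observe(self, tactic):
        # Record one use of a tactic by the player.
        self.observed[tactic] += 1

    def equipment(self):
        # Return the countermeasures the enemies have adopted so far.
        return [self.COUNTERS[t] for t, n in self.observed.items()
                if t in self.COUNTERS and n >= self.THRESHOLD]
```

The idea is just frequency counting plus a lookup table; what makes it feel intelligent in the game is that the adaptation happens gradually, over many missions.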

In contrast, computer A.I. is very different: it consists of a program designed to learn in real time. It stayed that way for years until multiple advancements were made within the past year. Early versions of computer A.I. took the form of chatbots, and that remained the norm until image generation was vastly simplified for users. A.I.-generated images have existed since the early 1970s but were not popularized until the boom in late 2022 and early 2023. By this time, image generation was nigh-realistic, with complications when it came to hands.

Figure 1.

Example of how A.I. programs generated hands

This, of course, was fixed in a few short months. Next came the popularity boom of A.I. videos and voices. When A.I. videos started, they were extremely uncanny, as the generators had difficulty producing movement with a consistent look. A.I. voices, on the other hand, can generate any person’s voice and tone as long as there is a prior sample to reference. While sounding slightly robotic, A.I. voices were mostly convincing and have advanced to the point where an entire song can be sung by an A.I.-generated voice. Recently, OpenAI launched a new program called Sora, an A.I. video generator that creates photorealistic videos with smooth animations. While at the moment it has no sound and can only generate 60-second videos, I can see a near future where this program can generate entire music videos, films, and even video games.

All these examples show how much A.I. has grown in such a short time, and the potential for how quickly technology can now advance, especially with the newly added help of A.I.

Figure 2.

Image taken from an A.I. generated video by Sora

Future of Computers

            In the near future, I predict that A.I. will be the basis of most computer programs used in homes and organizations. One idea I have for the future is an entire server where a business keeps track of products, funds, and services, and which automatically analyzes all activity and generates a report with graphics and comparisons to previous months and years. But A.I. would do more than just help with work: it would help with your personal accounts and money management, and eventually take over the world.

            Modern computers may very well have already reached the pinnacle of physical design, encompassing one or more monitors, a keyboard, a mouse, and the CPU. So the goal is to improve the systems and programs, the software of the computer; in other words, to continuously process more and more information at a faster rate until the wait time is near negligible. In fact, we already started this process a few years ago with the introduction of SSD storage.

            In short, computers and A.I. are the future of society, unless some catastrophic event happens (Skynet) and we lose access to or control of these systems.

References

University of Minnesota. (n.d.). Who was Charles Babbage? College of Science and Engineering. https://cse.umn.edu/cbi/who-was-charles-babbage#:~:text=This%20was%20the%20Analytical%20Engine,of%20Babbage%27s%20prototype%20computing%20machines

Gregersen, E. (2015, December 10). Ada Lovelace: The first computer programmer. Encyclopedia Britannica. https://www.britannica.com/story/ada-lovelace-the-first-computer-programmer

Williamson, T. (2023, December 22). History of computers: A brief timeline. LiveScience. https://www.livescience.com/20718-computer-history.html

Ursinus College. (2021, February 15). World’s first modern computer, born from an Ursinus professor, celebrates 75th anniversary. https://www.ursinus.edu/live/news/5455-worlds-first-modern-computer-born-from-an-ursinus#:~:text=On%20February%2015%2C%201946%2C%20Ursinus,ENIAC%2C%20the%20first%20modern%20computer

University System of Georgia. (n.d.). A brief history of the internet. https://www.usg.edu/galileo/skills/unit07/internet07_02.phtml
