There is no doubt that the first word that crosses our mind when we talk about computing is revolution. This is mainly because computer technology has never stopped evolving, leading us to a scenario in which we have at our disposal inventions and devices that, seen from a very short perspective, would have seemed almost impossible just a few months ago.
This evolution of computing and technology is easily verifiable by any user; that is, you do not need to be an expert to notice it. Just look at the specifications of each new cell phone, tablet or laptop released to the market every few weeks. All of them offer greater processing capacity in the same space, or some feature intended to place them above the rest of the competition.
Of course this benefits the user of computing devices, but on the other hand it also confuses many, who come to think that their device, bought perhaps a few months ago, is already old and useless. Despite these rough edges, it cannot be denied that computing has forever transformed the way we do things.
In this post we are going to talk about informatics and how it relates to the modern world and people, clearly and understandably, without technicalities.
Informatics has reached a level that until very recently was unthinkable. We can speak of 30, 40 or 50 years, which in a way seems like a long time; in terms of technological evolution, however, it is hardly a blink. This means that in a relatively short time, computing was born and expanded to unsuspected limits.
The capacity and performance that a simple tablet has today could, a few decades ago, only be obtained with huge printed circuits and many electronic components, and before that, with enormous structures filled with vacuum tubes, coils and miles of cable housed in specially conditioned rooms.
Today, this evolution in computing has allowed us to build a world to our measure, where everything can be measured, cataloged and put in its place. And although it may sound like a dystopian novel, the truth is that nowadays technology is used far more for good than for evil. Among the advantages that computing offers us at present are air traffic control, medical and scientific research, communications and a long etcetera.
The truth is that without the advances made in computing in recent years, in both hardware and software, none of this would be possible. Computing advances, research is streamlined, new devices are created, and with them computing advances again. Fortunately, this is the cycle of evolution that is giving humanity so many good results.
This is demonstrated by the enormous number of devices we use at home, at work, in the car or even as part of our clothing, which grows at a dizzying rate and which are increasingly powerful and capable. All these devices allow us to do things we could not do before, and may even be beneficial to our health.
What is computing?
In technical terms, computing is the science of the automatic and rational processing of information in support of knowledge and communications. It can also be considered the set of applications of this science through the use of computers and other computing systems with their respective software and applications.
As the years go by, certain terms that at first referred to modern concepts only experts understood become, as the fields related to them advance and reach the masses, everyday aspects of the lives of all human beings. That is what happened with computing.
If we look for the etymological definition of the word informatics, we can find various references in different languages throughout the history of the 20th century. However, despite being a term that has managed to cross the linguistic, social and cultural barriers of every region of the planet, one of the first approaches to the concept emerged in the early sixties, when the French engineer Philippe Dreyfus used the word “informatique” to name his company, “Société d’Informatique Appliquée”.
In English, the company’s name translates as “Applied Informatics Company”, and the concept contained in the term was intended to define the application of computers to store and process information, which is why it emerged as a contraction of the French words “information automatique” (automatic information).
However, there is an antecedent to the one mentioned, since in 1957 the German specialist Karl Steinbuch had used the term “Informatik” to title a paper published in a specialized journal.
Today things have changed, thanks to the gigantic scientific advances the computing sector has undergone, and this term, which surely came up almost casually in search of a word to define a specific activity, has become a true applied science which, through the use of computer systems and devices, allows the study and application of automatic information processing.
Given its mission, we can briefly establish that computer systems, both traditional and modern, must carry out three fundamental tasks to be recognized as such: the acquisition of information, followed by the treatment of that information, and finally the transmission of the results produced by the treatment process.
For this to be possible, it was necessary to create a complete system comprising various parts, among which hardware and software stand out. Computer science is in fact the point where a whole set of disciplines converges, which can then be applied to the most varied and diverse areas of knowledge and human activity.
What is computing good for?
As we mentioned, computing is the science developed in order to study information and the procedures to process it, transmit it and automate it. In more human terms, it could be said that computing is the vehicle that allowed us to reach places that, without the help of computers and information systems, would have taken many more years to reach.
Computer science encompasses many of the technologies that humanity has created to make its relationship with the surrounding environment a little easier. The fact that it draws on such varied origins allows it to be applied to all kinds of scenarios, and even to improve or extend the range of applications of the sciences it assists. An example of this is engineering: this field is no longer the same without computer-based devices and data processing. The same happens in other areas such as business, where the efficiency of companies would not be the same without process management systems such as ERP, CRM or SAP.
But to answer the question in the title, we can say that computing serves to let us control and automate, more efficiently, all the processes carried out on electronic devices, in order to avoid possible errors in data manipulation, improve the execution time of those processes, and handle other very complex tasks that would otherwise be very problematic to process.
Basically, this process consists of three steps, illustrated by the sketch after this list:
- the input of information, which can occur through any input peripheral
- the treatment of said information within the device
- and, as a third and last step, the output of the processed information, carried out through any output peripheral
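As a minimal sketch of those three steps, assuming nothing more than the keyboard as the input peripheral and the screen as the output peripheral (averaging some numbers is just an illustrative choice of “treatment”), it could look like this in Python:

```python
def acquire() -> str:
    # Input: read raw information from an input peripheral (the keyboard).
    return input("Enter numbers separated by spaces: ")

def treat(raw: str) -> float:
    # Treatment: transform the raw data inside the device.
    numbers = [float(token) for token in raw.split()]
    return sum(numbers) / len(numbers)

def transmit(average: float) -> None:
    # Output: send the processed result to an output peripheral (the screen).
    print(f"Average: {average:.2f}")

transmit(treat(acquire()))
```

However simple, every computer system, from a mainframe to a smartwatch, follows this same acquisition, treatment and transmission cycle.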
In more practical terms, computing is useful for creating applications, designing devices, developing hardware, building artificial intelligence and working in many other fields.
History of computing
For a long time, in every corner of the world, humanity searched for a method that would allow it to symbolize and manage information in a standardized way, so that at any place and time it could be analyzed by people with the necessary knowledge, who could consult or modify it according to their needs.
From knots in ropes, through clay tablets, to the first calculating machines, humanity always found a way to improve itself and its inventions, which led, years later, to our being able to place a space probe on the surface of an asteroid to confirm what it is made of.
This path began when Humanity industrialized and its needs became more difficult to solve. From there to having one or more computers, smartphones, tablets and other surprising electronic devices in our own home, there was only one step.
To understand how the history of computing developed, we have to keep in mind that it all started with humanity’s need to manipulate and remember information in a symbolic way.
Although we can assert that computing was born in the mid-1940s, to be a little more rigorous we must mention that computing has at least four stages of development.
The first act of this kind of historical play takes place in the ancient world, when the wise men of the time began to perform calculations with the help of common objects in their environment, such as stones. At this point it should be noted that the word “calculus” has its origin in the Latin term “calculus”, that is, “pebble”.
From there it evolved into abacuses and calculating boards, elements that can still be used without problems to make calculations of all kinds.
The second act of the play is perhaps the most important of all, since it should be considered the point where computing really got underway: the creation of ENIAC, a device built for the military, capable of calculating the trajectory of a projectile without having to carry out the corresponding field tests.
ENIAC, which stands for Electronic Numerical Integrator and Computer, was unveiled to the public in February 1946 and had approximately 18,000 vacuum tubes in its electronics.
At this point there are certain discrepancies, since some researchers argue that prior to ENIAC the British already had “Colossus” in service, of which several units were built during World War II to help decipher German codes.
In this sense, it should be noted that there was also a predecessor to both machines, although it never came into operation: a device capable of performing calculations at high speed by means of vacuum tubes, developed by John V. Atanasoff and Clifford Berry at Iowa State University in 1942.
The third act of this play occurs in the late 1970s, precisely in 1977, when Steve Jobs and Steve Wozniak presented the Apple II to the world. Why should we consider this fact a turning point in the story?
Although the Altair 8800 comes first on the list, having been launched to the market earlier, it was really impractical to use, which made it a device for people “in the know”; even so, it gave the initial push that allowed the computer world to spread quickly.
What really changed the story was the Apple II, which, like the later IBM PC, managed to move computing from the halls of universities and large companies to the living rooms of homes around the globe, and could be used by any type of user regardless of their knowledge or abilities.
To understand this fascinating subject in more depth, we invite you to read the article “Computer Generations”.
Annex: The first personal computer
The third act in this story is really important, since it was the moment when computing went from being something academic to being a device that could coexist with the television and the radio, that is, an everyday appliance in any home.
This story begins in 1975, when a company known as MITS released a computer kit that could be assembled by anyone with the necessary knowledge. The main feature of this kit was that it cost about $400, considerably less than the tens of thousands of dollars that the computers then on the market cost.
In addition to the price, it offered many other advantages, such as a floppy drive for storing data and programs, a version of BASIC, developed by a team led by Bill Gates, with which users could create their own programs, and some other features that were very important for the time.
Despite all these favorable characteristics, the Altair’s biggest problem was that it was not designed to be used by just any user. Using it required a fair amount of knowledge of programming, electronics, mathematics and other sciences.
Fortunately this changed in the late 1970s, precisely in 1977, when the Apple II appeared on the market, a computer that any type of person could easily set up and use.
However, the real impulse that took personal computers to the privileged place they occupy today was an application called VisiCalc, which allowed users to work with incredible speed with rows and columns as in a paper spreadsheet, a concept familiar to accountants around the world.
Arguably, this combination of the Apple II and VisiCalc was the true origin of personal computing, and it was also responsible for turning the known scenario upside down, since from then on software developers came to the forefront, leaving behind the prominence of large device manufacturers like IBM.
The best example of this can be found in Microsoft and Bill Gates, who have been leading the way in computing worldwide for years.
Of course, the aforementioned IBM was not going to be left out of this fabulous business, and in 1981 it launched its own computer, the PC or “Personal Computer”, which offered a series of advantages that made it interesting for other manufacturers and hardware developers.
The Personal Computer was designed around an open architecture, which enabled other companies in the IT sector to offer peripherals, expansion boards and software perfectly compatible with the IBM PC. It also used a very advanced processor, allowing it to use much more memory than any of its rivals on the market, and it ran a spreadsheet called Lotus 1-2-3 and an operating system developed by Microsoft.
In this type of competition, whoever falls behind loses, and for this reason Apple invested a great deal of resources and effort to launch the Apple Macintosh in 1984, the first personal computer of any kind to include a graphical interface and a pointing device, called a mouse, concepts borrowed from research conducted in the 1960s and 1970s at Xerox’s labs in Palo Alto, California.
Apple took this concept a bit further and introduced it to the general public, achieving great commercial success. Microsoft applied the same concept to its new operating system, called Windows. At this point the split between architectures and operating systems was already clear, and it would carry over for decades.
The fourth, and for now the last, act in the history of computing could be considered the emergence of the Internet as a method of communication. The Internet was born within the United States Department of Defense and the Pentagon-dependent Advanced Research Projects Agency (ARPA).
From the tests and protocols commissioned by these agencies, as far back as the 1960s, the Internet was born. From that incipient communication between two nodes in 1969 grew the network of networks we all know today.
Just as personal computers became popular with everyone in the 1980s, relegating the management of large volumes of data to mainframes and other large computers, which could not compete with the price, applications and ease of use of PCs, so the ability to connect two computers and share information between them regardless of distance would also become routine within just a few years.
Annex: The birth of Ethernet and the Internet
Although computers had already been adopted in commercial activities that required managing large amounts of data, to be able to handle all this data flow these computers had to be huge, required custom-designed software and were also incredibly expensive.
This was when people began to think about using smaller but interconnected computers. The result was the development, again by the Xerox Corporation, of a wired computer interconnection system called “Ethernet”, named after the belief of early physicists that light was transmitted through the ether.
This computer interconnection system allowed multiple devices to connect to each other regardless of their location within a building, making it possible to share resources such as memory, printers and other peripherals, in addition to providing one of the most useful tools of all time: email.
But it also led to the birth of a much larger network: the Internet. Simultaneously with the birth of Ethernet, thanks to research carried out by various organizations and the Advanced Research Projects Agency (ARPA), an attempt was made to do the same with computers that were geographically much further apart.
Originally, this research aimed to develop a method so that, in the event of war, communications could continue to operate even in a nuclear disaster scenario. Until then, communication networks were centralized, and however well the buildings containing the nodes and the control electronics were secured underground and in concrete structures, they could always be reached and destroyed, completely dismantling any attempt at communication between them.
That is why a decentralized communication network was necessary, and why the aforementioned ARPA began to finance research into this type of model.
This led to the implementation of a communication system in which information was divided into packets, each with a specific address for reaching the receiving computer. If one or more nodes on the network were down, the system could always find another route to deliver the corresponding packet. Once all the packets were received, the receiving computer reassembled them into the original document.
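As a rough illustration of that idea, here is a toy sketch in Python, with hypothetical names, in which “routing” is reduced to packets simply arriving out of order; the receiver can still rebuild the original document:

```python
import random

def to_packets(message: str, size: int = 8) -> list[dict]:
    # Divide the document into numbered packets, each labeled with its destination.
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"dest": "receiver-A", "seq": n, "data": c} for n, c in enumerate(chunks)]

def reassemble(packets: list[dict]) -> str:
    # The receiving computer sorts the packets by sequence number and
    # rebuilds the original document, whatever order they arrived in.
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

message = "Each packet can travel by a different route through the network."
packets = to_packets(message)
random.shuffle(packets)  # simulate packets arriving out of order via different routes
assert reassemble(packets) == message
```

Real packet-switched networks add addressing, error detection and retransmission on top of this basic divide-and-reassemble idea.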
Although it had been designed for other purposes, the truth is that specialists and scientists soon began to use the system to send and receive short messages, something that at the time was quite cumbersome.
This was until 1971, when Ray Tomlinson changed history forever. In order to send messages between computers more efficiently and in a more orderly fashion, he placed an at sign “@” between the name of the message’s recipient and the name of the computer that hosted their mailbox.
The simple reason the “@” symbol was chosen is that it was one of the few non-letter symbols available on the keyboard of the Teletype terminals used on ARPANET at the time. This was the birth of email as we know it today.
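A tiny sketch of that convention, using a purely hypothetical address, shows how the “@” splits an address into a mailbox name and the machine that hosts it:

```python
# "@" separates the user's mailbox from the computer that hosts it.
address = "ada@research-lab"  # hypothetical address, for illustration only
user, host = address.split("@", 1)
print(f"deliver to mailbox '{user}' on computer '{host}'")
```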
Returning to the beginnings of the Internet, some years later the National Science Foundation (NSF), a state-funded civilian agency, was granted permission to expand the network, allowing other networks to interconnect with ARPANET. The name the researchers gave this union of networks was the Internet.
Finally, in 1983, to standardize all modes of data transmission, a protocol suite called TCP/IP was adopted, which laid the foundations of the network of networks and is the protocol still used today. With these changes to the incipient Internet, new models of computers, called workstations, also appeared on the market; they were much better equipped for networking than a standard computer, with specific hardware and network-oriented operating systems such as UNIX.
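To give a feel for what TCP/IP offers programmers today, here is a minimal sketch using Python’s standard socket API; both endpoints run on the same machine, and the port number is an arbitrary choice for the example:

```python
import socket
import threading

# The server socket is bound and listening before the client connects.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 50007))   # localhost and an arbitrary free port
srv.listen(1)

def serve_one() -> None:
    conn, _addr = srv.accept()   # wait for a single client
    with conn:
        data = conn.recv(1024)   # TCP delivers the bytes reliably and in order
        conn.sendall(b"echo: " + data)

threading.Thread(target=serve_one, daemon=True).start()

# The client opens a TCP/IP connection, sends a message and reads the reply.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", 50007))
    cli.sendall(b"hello over TCP/IP")
    print(cli.recv(1024).decode())   # -> echo: hello over TCP/IP
srv.close()
```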
But the landscape of that Internet really changed in 1991, when the World Wide Web was presented to society by Tim Berners-Lee. It was basically a set of protocols operating on top of the Internet’s own protocols, offering a simple way to access the various contents of the network, regardless of the type of computer, device, operating system or connection protocol used. With this set of guidelines, the first web browser also appeared.
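That layering can be seen in a couple of lines: the sketch below fetches a page over HTTP, one of the protocols the Web defined on top of TCP/IP. The URL is a publicly reserved demonstration domain, and running it requires a network connection:

```python
from urllib.request import urlopen

# HTTP (a Web protocol) rides on top of TCP/IP (the Internet's protocols).
with urlopen("http://example.com") as response:
    print(response.status)        # HTTP status code, e.g. 200
    print(response.read()[:80])   # the first bytes of the HTML document
```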
Of course, this was the spark that made the Internet expand into what it is today, with all the services it is capable of providing, such as online storage, various cloud apps and more.
Tim Berners-Lee, in addition to being the developer of the World Wide Web, was also responsible for developing a small application called a browser, which allowed access to Internet content in a simple way from any personal computer.
In this sense, although Berners-Lee’s creation was surpassed in a short time, it laid the foundations for the development of another browser with a little more history behind it: Mosaic, created by engineers at the University of Illinois in 1993.
This rudimentary browser gave way to Netscape Navigator, which was indeed the decisive leap that got computer users exploring the web. Netscape was eventually acquired by AOL, while Microsoft licensed Mosaic’s technology to create Internet Explorer. Then came the other browsers and Internet services, and the story keeps moving forward.
There is no doubt that computer science is nowadays one of the most interesting bets for anyone considering a career. Computing extends to many areas where it was not even feasible to install a computer before, such as inside a vehicle or as a refrigerator’s control unit.
This means that the application of computer engineering is not limited to the field of computers, their hardware or their software. It can be implemented in virtually any scenario where data needs to be processed efficiently, quickly and accurately.
Basically, computer engineering, or computing engineering as it is also called in other parts of the world, is a branch of engineering that applies the foundations of many other sciences together, such as telecommunications engineering, electronic engineering, software engineering and of course computer science, among others, in order to provide data processing solutions autonomously and comprehensively from the point of view of computing and communications.
To apply computer engineering to any process, this branch of engineering must encompass certain theoretical faculties, which are ultimately what make it possible to study a problem from its root and, from there, find the implementation that best suits the needs of that particular problem.
In order to successfully tackle any scenario, a computer engineer must know how to implement data and information processing in telecommunications, which allows them to plan and develop, for example, communication networks that fall within the parameters of security and legality.
It is also necessary for the computer engineer to have deep knowledge of robotics, artificial intelligence and algorithms, and of course to master as many programming languages as possible, in order to successfully carry out any development where this type of implementation is required.
In addition, they must have knowledge of software development and maintenance, as well as of the aspects derived from its possible commercialization. The computer engineer must also have extensive knowledge of industrial and business organization, in order to put into practice the planning, management and control of computing projects in IT departments.
Likewise, the computer engineer must possess practical and theoretical knowledge of electronics, in order to develop, design and physically build control interfaces and other mechanisms that complement the software they create.
There is no doubt that computer engineering is today one of the most challenging careers a student can face, since succeeding in it requires studying and understanding all the sciences it comprises, a truly overwhelming task. But computer engineering is undoubtedly one of the careers that, in a few years, will be essential for us to carry on with our lives surrounded by sophisticated electronic inventions.
There is a great deal of discussion about the definition of educational informatics and about the role the computer should play in educational institutions such as schools or colleges. Depending on the educational vision and the technical and pedagogical conditions, this term can take on different meanings.
It could be said that educational informatics means «the insertion of the computer in the teaching-learning process of the curricular contents of all levels and modalities of education. The subjects of a certain curricular discipline are developed with the help of a computer.»
Finally, educational informatics is also in charge of teaching students computing skills, training them in commercial applications. We can say that it is not enough to have technical knowledge, to know the components of the computer in depth, or to know how to program in different languages: there are several other aspects that must be considered in this process. The most important is to be aware of the implications of computers in society.
Computing and the computer
As we have already mentioned, computing rests on two fundamental platforms: on the one hand, software, that is, all the programs, operating systems and other code that let us carry out specific tasks; and on the other hand, hardware, which is basically the physical support, that is, all the tangible components of a computer. For this reason, the computer has, from its beginnings, been the tool necessary for computing to exist.
According to the historical record, the first computer was the so-called Z3, designed by Konrad Zuse in 1941, which is considered the pioneer in the field of automatic programmable machines.
The truth is that from that moment we must make a qualitative leap in the conception of this type of machine and place ourselves in the early 1980s, when the first personal computer, also known by the acronym PC, arrived, developed by IBM in collaboration with two geniuses of the digital world: Bill Gates and Paul Allen.
From there, with the passing of the years, ever deeper research into the miniaturization of electronics, the development and advance of the Internet, and the growing deployment of software, we arrive at the present day, where computing is an everyday element of our lives, transcending all borders.