Are We Too Dependent on Computers?

The computer has been one of mankind's greatest inventions since the foundations of science were laid. Its development was the result of years upon years of experimentation, spanning a century or more and conducted not by one person but by many. The development of computers as we know them today is a continuous process, and it always will be. However simple they may now seem to the computer literate, computers rest on a complex set of systems underneath. Fully understanding them takes multiple disciplines in both computer studies and electronics. After all, computing is itself subdivided into branches, just as science is.

While other technological inventions may have been developed before the foundation of science, "technology" is not quite the proper term for them. The word technology is, after all, always correlated with science, and strictly speaking the two are mutually inclusive. The computers of today, however advanced they may seem, had their origins in humble beginnings.

How did the computer begin?

The abacus, the earliest form of calculator, has been recorded in use since the early civilizations, estimated at around 1000 to 500 B.C., and was later adopted elsewhere in the world. The logic of how a computer's algorithms do arithmetic was, in essence, based on it. Much later, as early as the 1820s, Charles Babbage, dubbed one of the fathers of the modern computer, developed ideas on how computers should do their mathematics: first the difference engine, which later evolved into what is known as the analytical engine. Because of funding issues, Babbage did not see his ideas come to fruition during his lifetime; it was his youngest son, Henry Babbage, who built a working portion in 1910 based on his father's designs. Even so, this primitive form of computer was nowhere near as advanced as the computers we see today.
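To make the difference engine's approach concrete: it tabulated polynomial values using only repeated addition, via the method of finite differences. The sketch below is a modern illustration of that idea, not anything from the article; the function name and example polynomial are our own.

```python
# Illustrative sketch (assumption: not from the article): Babbage's
# difference engine tabulated polynomials using only addition, via the
# method of finite differences.

def tabulate(initial_differences, steps):
    """Given p(0) and its finite differences at x=0, produce successive
    values p(0), p(1), ... using additions only, no multiplication."""
    diffs = list(initial_differences)  # [p(0), Δp(0), Δ²p(0), ...]
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Each difference absorbs the one below it: pure addition.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# Example: p(x) = x² has p(0)=0, Δp(0)=1, and constant Δ²p(0)=2.
print(tabulate([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```

Because each step is a handful of additions, the scheme maps naturally onto mechanical adding wheels, which is why it suited a machine built decades before electronic multipliers existed.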

The idea of a machine that computes on our behalf, hence the word "computer," came out of the need to handle complex problems and perform computations that are both difficult and time-consuming for people. This was especially true during the industrial era and the great world wars, when the need for such machines arose. How a computer behaves is determined by the programs stored in its library.

The development of the computer has grown enormously since Charles Babbage laid its foundation, inspired by the existing "technologies" of his time. From figures significant in the foundation of computing, such as Ada Lovelace, Konrad Zuse, Alan Turing, John Atanasoff and Clifford Berry, and Howard Aiken and Grace Hopper, up to present-day giants such as Bill Gates, Steve Wozniak, and Steve Jobs, computers have grown bigger in function even as they have shrunk in size, and have found a place in people's lives in both commercial and personal use.

How do people use computers in their daily lives?

Modern computers laid the foundation for how we perform our duties today. They are far more efficient and get work done in less time. From simple household leisure, such as playing games or running multimedia programs, to office work, to the more difficult task of developing programs, up to complex computations like those done at NASA, computers make all of this possible, all in a single box. What once took groups of people a long time to finish, as seen in companies without computers, can now be finished far more quickly with them.

Computers taking over the world

One of the most popular uses of computers is the internet. What was once the role of telephones and telegrams has become the internet's, and it is worldwide. Computers are, quite literally, taking over the world.

Although initially used for military purposes concurrent with the development of the computer itself, the internet grew to become commercialized, as it is used today. Alongside and beyond those earlier means, the internet made communication around the world possible, which in turn gave rise to communication tools such as social media. To date, billions of people use computers with the internet every day.

Are we really too dependent on computers?

We may well be dependent on computers in relation to the internet, given the information age we have lived in since the computer age began. But that dependency began with a good intention: to keep up with the demands of progress through the efficiency and speed with which work gets done when computers serve as both our aids and our tools. Let's face it, there are complex tasks out there that can only be done efficiently if, and only if, we have computers.

However, one should also ask whether such dependency is good for us. What if, by some chance, this technology and everything it can do were taken from us? What then? Like a bad addiction, computer dependency beyond our needs and beyond moderation can be harmful to us, its users, and an ideal tool for tyrants.

While that may sound out of context, it is not. We are as capable of work as our ancestors were without computers, though obviously at the cost of the efficiency and ease we have come to know in doing things with them. These are not statements about abandoning computers as we know and use them; they are a reminder of who we are without computers and who our ancestors were without them. No, we are not worthless without computers; we are simply not as capable as someone who has one, and none of us alone can do what a complex computing machine does. But make no mistake: we are still the ones who made computers what they are today. In that sense, mankind is still superior to its machines.

Robotic Human Vs Humanoid Robots

The term “Robotic Human” is used to signify the impact of mobile devices and wearable computing on users. These devices help to improve personal productivity, to the extent that users start to behave like machines.

The term "Humanoid Robot" is used to signify intelligent machines that can imitate the behavior of living beings.

In the current era of evolution, the Robotic Human has taken a lead over the Humanoid Robot. This has been enabled by the success of emerging technologies in the consumer domain over the enterprise world.

It can be attributed to two different mindsets: productivity improvement versus cost reduction. While consumer applications sought to improve personal productivity, enterprises looked for opportunities to reduce labor costs. This may be a big factor in the disruptive success of consumer technologies. The political pressure of job losses adds to the resistance to change. Even now, enterprises are looking to exploit the improved productivity of employees. Instead, they should look at overhauling business processes and consider the evolving relationship between man and machine to improve business efficiency in terms of reach and revenue.

The metamorphosis of the virtual world, driven by the growth of mobile connectivity and social networking, is set to change human behavior forever. This dynamism is likely to continue, as wearable devices and virtual reality applications diminish the gap between brick-and-mortar reality and the online world. Applications for augmented reality and context-aware computing will expand human productivity to a level at which people start to resemble machines. The communication channels of desktop e-mail and landline phones were not intrusive, but the always-available nature of mobile devices has bred compulsive behavior in users. Peer pressure forces them to respond in real time. If the side effects of stress can be controlled, the increase in productivity can create robotic humans.

It is almost a coincidence that the rise in human productivity has coincided with a decline in focus on developing robots. Most human-productivity applications were developed in the consumer world, where developers could recover their investments quickly by shortening product development cycles and running attractive promotions.

By contrast, the developers of robots faced rising capital spending for a long time. The acute financial pressure of economic uncertainty forced many of them to scale down investment in the development of humanoid robots. Even end-user corporations have been unable to fund infrastructure-modernization projects for increasing automation. Among other factors, low-cost offshore labor with lower productivity was considered more economical than highly productive, costly machines.

The balance of power seems to have shifted back to humans. Humans are improving their personal productivity using machines instead of being replaced by them. Enterprises have bowed to consumer technologies, allowing BYOD (Bring Your Own Device), cloud, and social media in order to leverage improved employee productivity and consumer engagement. This has made employees and consumers more valuable.

Looking ahead, technology applications that try to increase automation at the expense of human labor are likely to face resistance in the enterprise domain, whereas applications that strive to improve human productivity are expected to succeed. As employees and consumers own these applications at their own expense, enterprises will frame policies and customize operations to take advantage of them. Many are already revising rules and processes to accommodate employee-owned tablets. If concerns about privacy and secrecy can be addressed, enterprises gain a more productive employee, one who does not depend on enterprise funds to acquire the gadgets, gizmos, and machines that improve personal productivity.

Enterprises should not be afraid of losing control of their information infrastructure to the intrusion of consumer devices. They should be willing to capitalize on the improved productivity of employees. This needs to be communicated properly as employee empowerment: allowing people to work from their personal devices.
