Human-Computer Network

The Human Network

Computers are only part of what a network is. For a network to produce results of any kind, a human being must build it, operate it, or analyze the results it produces. For much of computing history, a person would sit at a terminal and clack away at a keyboard to make the computer, and the connected network, perform some task. That person would then analyze the results on a monitor or, in olden days, on punch cards or green-bar paper. In any case, it was humans on one side and computer networks on the other.

This is changing in two directions.

The Internet of Things To Come

The first is via the Internet of Things (IoT). Cisco Systems estimated that in 2013 there were 13 billion devices connected to the Internet, and that by 2020 the number will balloon to 50 billion (Goodman, 2015).

The promise of the networked IoT is that, for example, your alarm clock checks traffic conditions for your commute from, say, your home in Maryland to your job in Manassas, Virginia, and adjusts your wake-up time accordingly, allowing you more or less sleep based on its findings. By the time your clock awakens you, it will have already connected with the rest of the house: turning on lights, brewing the morning Joe, letting the mutt out through the IoT pet door, and even finding the car keys you left in the bathroom (Goodman, 2015).

One challenge for the burgeoning IoT world is data storage. As all these devices communicate and produce histories and other forms of data, that data will need to be kept somewhere. In many cases, it is unclear whether it is even worth keeping. But, who knows, some day you might want to know how many times the refrigerator door was opened on Mondays versus Tuesdays. Tools now exist to aggregate such data, but as sensors are added to existing devices, big data will only get bigger and, possibly, unmanageable (Matchett, n.d.).

I’m Only Human

The other direction involves humans interacting more intimately with networks, not just devices interacting with other devices. One way to accomplish this is with wearable tech. SixthSense, developed at MIT by Pranav Mistry, is a “wearable gestural interface” that lets the wearer project a computer interface onto everyday surfaces and control it with finger movements (Brodkin, 2010).

Emotiv’s Tan Le demonstrated a headset that lets the wearer control a computer directly with brainwaves (Brodkin, 2010).

Researchers at Brown University are developing a brain-computer interface (BCI) that has shown success with animal test subjects. The implant’s one hundred electrodes transmit data at 24 Mbps to a nearby receiver (Anthony, 2013).

As data continues to accumulate it needs to be stored somewhere. One logical place is in the human brain. Comparing the data capacity of the human brain to conventional storage devices is far from an exact science. A commonly accepted figure is 100 terabytes, but it could be as high as 2.5 petabytes. Nobody really knows for sure (Gonzalez, 2013).

With more than seven billion people on Earth, that adds up to a great deal of data. Mathematically, at roughly a petabyte (10^15 bytes) per brain, that would be 10^15 × 7,000,000,000 bytes (“Population,” 2015).
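That back-of-the-envelope figure can be checked in a few lines of Python. The per-brain capacity and population are the rough estimates from the text, not measurements:

```python
# Back-of-the-envelope check of the aggregate figure above.
# Both constants are the hedged estimates cited in the text.

BYTES_PER_BRAIN = 10 ** 15     # ~1 petabyte; estimates range up to 2.5 PB
POPULATION = 7_000_000_000     # world population, circa 2015

total_bytes = BYTES_PER_BRAIN * POPULATION
print(f"{total_bytes:.1e} bytes")  # on the order of 10^24 bytes (yottabytes)
```

Even at the conservative per-brain estimate, humanity collectively would hold on the order of yottabytes, dwarfing any conventional data center.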

However, with a human life expectancy of 78 years, these data storage centers will need constant replacement. Replacements themselves are not a problem, since humans reproduce, but a method of transferring data from the outdated units to the new models will need to be developed.

Spock’s Brain

This is not as far-fetched as it might seem. Although it seemed rather outrageous when Spock’s brain was removed to run a computer in Star Trek’s “Spock’s Brain” episode, researchers at the University of Washington were able to send brainwaves (brain kept inside head) via the Internet (Skype, actually) to a colleague and have that person’s hand move on a keyboard (Farber, 2013).

“Brain and brain. What is brain?”—Kara, Spock’s Brain.

The answer to that question is a vast, untapped data center that could complete and augment the network of humanity, where humans and computer networks become interchangeable. The possibilities are intriguing and scary at the same time.

References

Anthony, S. (2013, March 4). Brown University Creates First Wireless Implanted Brain-Computer Interface. Retrieved March 12, 2015 from http://www.extremetech.com/extreme/149879-brown-university-creates-first-wireless-implanted-brain-computer-interface

Brodkin, J. (2010, September 2). The Future of Human-Computer Interaction. Retrieved March 12, 2015 from http://www.networkworld.com/article/2217838/virtualization/the-future-of-human-computer-interaction.html

Farber, D. (2013, August 27). Scientist Controls Colleague’s Hand in First Human Brain-To-Brain Interface. Retrieved March 12, 2015 from http://www.cnet.com/news/scientist-controls-colleagues-hand-in-first-human-brain-to-brain-interface/

Gonzalez, R. (2013, May 24). If Your Brain were a Computer, How Much Storage Space Would It Have? Retrieved March 12, 2015 from http://io9.com/if-your-brain-were-a-computer-how-much-storage-space-w-509687776

Goodman, M. (2015, March 11). Hacked Dog, a Car that Snoops on You and a Fridge Full of Adverts: The Perils of the Internet of Things. Retrieved March 12, 2015 from http://www.theguardian.com/technology/2015/mar/11/internet-of-things-hacked-online-perils-future

Matchett, M. (n.d.). Internet of Things will Boost Data. Retrieved March 12, 2015 from http://searchstorage.techtarget.com/opinion/Internet-of-Things-data-will-boost-storage

Population. (2015, February 5). World Bank. Retrieved March 12, 2015 from http://www.google.com/publicdata/explore?ds=d5bncppjof8f9_&met_y=sp_pop_totl&hl=en&dl=en

Jeff Macharyas
MS-Cybersecurity and Computer Forensics
Utica College
