Technobabble is our look at the more colorful aspects of technology and the tech industry. Be sure to check out our last edition about spaces vs. tabs.
What's more American than apple pie? Technological innovation (though some would argue Budweiser and the National Football League).
The United States has long been home to advancements in technology, thanks to long-standing companies like IBM, founded in 1911, and Intel, founded in 1968. With the emergence of Silicon Valley as a tech powerhouse, the U.S. has continued to solidify its international standing as a center for technology, whether that's mobile solutions or cloud offerings.
In honor of the Fourth of July holiday weekend, here's a look back at three technologies created in the U.S. that have shaped the course of modern computing:
The iPhone
The first iPhone, which debuted in 2007, looks as clunky as the car phones of yesteryear when compared to the smartphones of today. But the iPhone changed the course of mobile phones. Instead of the Nokia brick phones that were prominent in the early 2000s, or the trend toward ever-smaller handsets, Apple made a pocket-sized personal computer.
Its innovation sparked the rise of the app economy and forever changed how people access information. So here's to 10 years of the iPhone and a decade of never having to wait to learn the answers to burning questions. Siri, how long would it take me to walk to the moon and back?
The cloud
Sure, Amazon wasn't the first to virtualize computing, but it was certainly the first to make it widely popular. Launched in 2006, Amazon Web Services made compute and storage available on demand, taking over the business world in the process.
AWS remains the king of cloud computing, named the leader of the IaaS market by Gartner. Its work has changed how companies approach storage, allowing applications to draw on the cloud rather than take up space on devices.
For companies looking to make a name in the cloud space, there is no longer room to compete for market dominance. Given AWS' pace of storage innovation, competitors are better off vying for niche roles as specialized storage providers.
The internet
Sure, this one is a bit cheeky, considering that the U.S. alone cannot take credit for the internet. But it kind of can. Technology today would look a lot different if the internet had not reached maturity.
Created in the 1960s, the U.S. Department of Defense-funded ARPANET was the first "workable prototype" of the internet, according to History.com. The early version of the internet allowed multiple computers to talk across a single network. Once the World Wide Web was created in 1990, the course of history changed, as it provided a way to access data online.
Of course, there are security concerns today, and archeological layers of technology inhibit modernization efforts, but those are minor problems compared to what the internet has made possible. After all, the global economy would crash if the internet were suddenly taken away.
Products have become digital first and digital always, and the internet has allowed a digital record of humanity to emerge, one set to preserve history without fear of records crumbling to dust.
One macro thing
The mere presence of smartphones in our lives is making us dumber; science suggests smartphones are distracting enough to inhibit learning.
A recent study in the Journal of the Association for Consumer Research found that fluid intelligence, or the "ability to reason and solve novel problems, independent of any contributions from acquired skills," is threatened by the mere thought of our phones.
In other words, our capacity to retain information and adapt to new skills is being eroded by the distraction of smartphones.
Most of us choose to distract ourselves with our devices. Many may assume that quickly Snapchatting a selfie with the dog filter is a harmless 20 seconds taken from the day. But in those 20 seconds, your phone just siphoned off a bit of your cognitive capacity.
A study at the University of Texas concluded that our phones sap us even when they are off: "your conscious mind isn't thinking about your smartphone, but that process — the process of requiring yourself to not think about something — uses up some of your limited cognitive resources. It's a brain drain."
When your fluid intelligence must combat the call of a dog filter, your brain sacrifices capacity to learn and develop, whether it wins or loses. Was the dog filter worth it?
One micro thing
If we learned anything from Indiana Jones, it's that immortality is not meant for humans; intelligence, however, may be. Humans cannot live forever, but our thoughts, in the form of artifacts and technology, can outlive mankind.
Enter artificial intelligence, the Holy Grail of immortal thought.
CNBC reports on the case IBM made on Capitol Hill this week: companies skeptical of integrating new technology risk being left behind, and digital cognition, or AI, is a way of bettering mankind, not dismantling it. Big Blue argued that job loss should not be a factor in limiting AI expansion; jobs can be streamlined and automated, and new ones created to support advancements in AI.
We own AI just as much as we own our thoughts, since AI is programmed from our thoughts and preferences. For now, the machines are not free-thinking entities or dark overlords.
The main difference between mankind and AI lies in the capacity of their reach. In that regard, we are outmatched, but we also benefit. The value is evident in the use of drones, Unilever's automated hiring for HR and even cancer research.
Now, those fearful of a doomsday AI overlord must remember that the "Holy Grail cannot pass the Great Seal," and we have not yet passed the seal of sentient systems. Elon Musk, however, has been vocal about his ironic fear of AI, telling Vanity Fair we could "produce something evil by accident."
One last thing
ZDNet reports that between 2016 and 2019, there will be an estimated 31 million robots in our homes, designed to carry out basic household functions.
In other words, everyone born before 2016 may feel a sense of resentment toward emerging generations who won't know what a household chore is.
Misty Robotics CEO Tim Enwall expects the company's domestic robots to be used for "watering the plants, playing with the cat, reading to your child, keeping an eye on your house and even keeping us safe." He expects his company to become the Apple of domestic robotics.
Time will tell not only how functional domestic robots will be, but also how accepting the public will be, given lingering AI skepticism.
We are already accustomed to robotics serving up information. How long will it take for us to grow accustomed to robotics serving our every whim? Hopefully after most post-2016 babies learn a few essential household chores (that dishwasher won't unload itself ... yet).