Last weekend I attended the Shape Expo (website here) in San Francisco, and my experience there surprised me. I read a fair amount of tech news and was already somewhat familiar with most of the new tech on display: virtual reality, object recognition, artificial intelligence, robotics, holograms and more. There were still plenty of "wow" moments at the exhibits, but I found that the biggest "wow" built up in accumulation, and it was well articulated in a Saturday afternoon panel discussion entitled "The Networked Future."
One of the most interesting points of that presentation was the claim that the large networking companies (AT&T, Verizon) have outpaced Moore's law over the past five years. (Moore's law, loosely stated, says that an emerging technology's capacity will double roughly every 18 months.) This has been happening in both network speed and capacity, and the trend is so pronounced that young developers and students have largely abandoned one of the main tenets of software development from the last century: programming within resources.
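To put that claim in perspective, here's a quick back-of-the-envelope calculation (mine, not the panel's) of what an 18-month doubling cadence predicts over five years:

    # What the 18-month doubling rule of thumb predicts over 5 years
    months = 5 * 12
    doublings = months / 18          # about 3.3 doublings
    growth = 2 ** doublings          # about 10x capacity
    print(f"{doublings:.2f} doublings -> roughly {growth:.1f}x capacity")

So "surpassing Moore's law" here means growing by more than roughly an order of magnitude in that five-year window.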
Today's app developers don't worry about bandwidth or RAM. They simply assume the machines will keep up with anything they can throw at them, and in this era, they're right. For those of us who have spent much of our professional lives surveying data to see where we might use a 4-bit integer instead of an 8-bit one, this can be a bit disconcerting, but it's also liberating.
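As a small illustration of what "programming within resources" looks like in practice (my own example, not from the panel), the integer width you pick still matters when you're storing millions of values; NumPy doesn't offer a 4-bit type, but even 8-bit versus 64-bit makes the point:

    import numpy as np

    n = 10_000_000  # ten million small readings, values 0-15

    wide = np.zeros(n, dtype=np.int64)   # the "don't think about it" default
    narrow = np.zeros(n, dtype=np.int8)  # the resource-conscious choice

    print(wide.nbytes // 1_000_000, "MB")    # ~80 MB
    print(narrow.nbytes // 1_000_000, "MB")  # ~10 MB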
Another interesting point was that the public is about to see and use the internet in very new and different ways. It's generally understood that the internet is here to stay and that its role in our culture has become ubiquitous and permanent. What's less understood is that it's morphing from a pure information conveyance - which it will continue to be - into a physical and situational control network. It will help you control your air conditioning. It will tell you if your teenager has been speeding. And if you manage a business, it will provide you with more actionable information per hour than you ever thought possible - far more than you can use - and it will make good decisions based on that information.
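To make the "control network" idea concrete, here is a minimal sketch of the round trip involved: read a device's state over the network, make a situational decision, and send a command back. The endpoint and payload are hypothetical stand-ins, not any real product's API:

    import requests

    THERMOSTAT = "https://example.com/devices/thermostat-42"  # hypothetical device endpoint

    # Read the device's current state over the network
    state = requests.get(f"{THERMOSTAT}/state", timeout=5).json()

    # A simple situational decision: cool the house before anyone gets home
    if state["indoor_temp_f"] > 78 and state["occupancy"] == "arriving_soon":
        requests.post(f"{THERMOSTAT}/setpoint", json={"cool_to_f": 74}, timeout=5)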
Back in the 1990s, one of my first jobs as a developer was with a software company that made and supported logistics software for the public warehousing industry. The software managed customers, inventory intake, storage, locations, billing and shipping, to the point of optimizing multi-customer truckloads for best efficiency. What it did not do was talk to truck drivers or scrutinize their routes, stops or driving behavior in the truck.

But that company is now very likely considering products and services like IMBOT, which come with hardware and apps that can relay exactly this kind of information, and, very soon, secure APIs that let developers integrate those services with larger management software. So it may soon be possible for "smart" software to handle a request like a last-minute emergency shipment by diverting the best-placed nearby truck, updating the driver's delivery schedule with a new timetable, and making any secondary updates as needed - all without a human having to dig for time-sensitive information that would almost always be incomplete, then make a decision that could go wrong because of that incomplete picture and its unintended consequences. A computer is simply better at logistics, especially where complexity or speed is an issue.
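A toy sketch of the kind of decision I'm describing - choosing which active route to divert for an emergency shipment - might look like this. The data model and the selection rule are purely illustrative assumptions, not anything from that company or from IMBOT:

    from dataclasses import dataclass

    @dataclass
    class Route:
        driver: str
        miles_from_pickup: float   # reported live by in-cab hardware
        spare_capacity_lbs: float
        minutes_of_slack: float    # schedule slack before the next committed stop

    def pick_route_to_divert(routes, shipment_lbs, detour_minutes):
        """Pick the closest truck that can absorb both the load and the detour."""
        feasible = [r for r in routes
                    if r.spare_capacity_lbs >= shipment_lbs
                    and r.minutes_of_slack >= detour_minutes]
        return min(feasible, key=lambda r: r.miles_from_pickup) if feasible else None

    routes = [
        Route("Ana",   miles_from_pickup=4.2,  spare_capacity_lbs=900,  minutes_of_slack=50),
        Route("Marco", miles_from_pickup=1.1,  spare_capacity_lbs=200,  minutes_of_slack=90),
        Route("Kim",   miles_from_pickup=12.8, spare_capacity_lbs=1500, minutes_of_slack=20),
    ]

    best = pick_route_to_divert(routes, shipment_lbs=750, detour_minutes=35)
    print(best.driver if best else "No feasible diversion")   # -> Ana

The real version would of course weigh fuel, hours-of-service rules, customer priorities and more, which is exactly why a computer fed with live route data makes the call faster and more reliably than a dispatcher working from stale information.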
This will change everything for business and supply chains, and it will offer small companies the same information advantages as large shipping fleets.