Archive for the ‘Software, Technology, and Wow I Didn't Know That’ Category

Every blogger, web poster and Twitter or Facebook aficionado dreams of the day one of their posts goes viral.  (We wrote of our own brief brush with viral fame back in 2010, with a post on multi-tasking that reached 2,500 hits – still trivial by Internet virality standards.)

But as it turns out, going viral is usually not what it appears to be.  When we see a Facebook post with thousands of shares, or a YouTube video with millions of hits, we’re inclined to think that’s the result of countless ‘personal’ shares, “like infected individuals passing along the flu,” as Derek Thompson, author of Hit Makers: The Science of Popularity in an Age of Distraction, noted in a recent article for Time.

Apparently, for a long time no one really knew whether a popular idea or product was truly viral.  It was hard to track precise word-of-mouth buzz.  But online, scientists have been busily studying information as it pings around the Internet, and in 2012 researchers at Yahoo studied the spread of messages on Twitter.

Their conclusion: nothing really ever goes viral.

Thompson notes that more than 90% of messages did not ‘diffuse’ at all.  About 95% of what we see on Twitter comes directly from its source, or from one degree of separation.
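For the technically curious, that ‘one degree of separation’ finding is a depth tally over share trees.  Here’s a minimal sketch in Python of how such a measurement works – the handful of retweet edges is invented purely for illustration:

```python
from collections import defaultdict, deque

# Hypothetical retweet edges: (sharer, whoever they shared from).
# "root" is the original broadcaster (depth 0).
edges = [("a", "root"), ("b", "root"), ("c", "a"), ("d", "root")]

children = defaultdict(list)
for child, parent in edges:
    children[parent].append(child)

# Breadth-first walk from the source, recording each share's depth.
depths = {"root": 0}
queue = deque(["root"])
while queue:
    node = queue.popleft()
    for child in children[node]:
        depths[child] = depths[node] + 1
        queue.append(child)

# Fraction of the audience sitting at the source or one hop away.
within_one_hop = sum(d <= 1 for d in depths.values()) / len(depths)
print(f"{within_one_hop:.0%} of nodes are within one hop of the source")
```

In this toy tree, 80% of the nodes sit within one hop of the source; the Yahoo researchers ran essentially this kind of tally over real Twitter cascades and found about 95%.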

Thompson writes: “Popularity on the Internet is still driven by the biggest broadcasters – not by a million 1-to-1 shares, but rather by a handful of 1-to-1 million shares.”  They come from a wide range of blast points – not just the legacy TV companies like NBC, CBS or Fox, but from places like Reddit or, yes, the Kardashians.

The viral myth is enchanting to writers because it feels so uplifting, promising small-time writers, photographers, artists and videographers a shot at widespread artistic fame.  Alas, it rings mostly false.  It might feel good to think of the Internet as a bastion of democracy where anybody can become a star by making something interesting or good enough, but that’s not really the case.

In the end, as Derek Thompson says, “virality is a David myth obscuring the fact that the Internet is still run by Goliaths.”


Read Full Post »

Two recent articles, one from Christopher Mims of the Wall Street Journal, and another featuring IBM CEO Ginni Rometty (also writing for the Journal), provide glimpses into where Artificial Intelligence – AI – is likely to take us.  And from both, one conclusion is clear: it’s all in the data.

The end of the year is a great time to be thinking about the future.  And AI will increasingly be a part of everyone’s future.  The gist of the arguments from both Rometty and Mims is clear: data – big data – will be what makes AI truly possible.

While today’s newer smart assistants, like Alexa and Siri, are entering into our everyday lives, they represent only the beginning.  Already, Alphabet (Google), Amazon and Microsoft are making their AI smarts available to other businesses on a for-hire basis.  They can help you make a gadget or app respond to voice commands, for example.  They can even transcribe those conversations for you.  Add to that abilities like face recognition to identify objectionable content in images, and you begin to see how troves of data (in these cases, voice and image) are being transformed into usable function.

But all this data and technology, Mims notes, are not going to suddenly blossom into AI.  According to data scientist Angela Bassa, the real intelligence is still about ten years away.

Why?  Three obstacles:

  • Not enough data. Most companies simply don’t have enough data to do deep learning that can make much more than an incremental difference in company performance.  Customers are “more interested in analytics than in the incremental value that sophisticated AI-powered algorithms could provide.”
  • Small differences generally cannot yet justify the expense of creating an AI system.
  • There is a scarcity of people to build these systems.

All that being said, Ms. Bassa, noting that there are only about 5,000 people in the world who can put together a real AI system, says that “creating systems that can be used for a variety of problems, and not just the narrow applications to which AI has been put so far, could take decades.”

IBM CEO Ginni Rometty notes that the term artificial intelligence was coined way back in 1955 to convey the concept of general intelligence: the notion that “all human cognition stems from one or more underlying algorithms and that by programming computers to think the same way, we could create autonomous systems modeled on the human brain.”  Other researchers took a different approach, working from the bottom up to find patterns in growing volumes of data – a method called IA, or Intelligence Augmentation.  Ironically, she notes, it is the methodology not modeled on the human brain that led to the systems we now describe as ‘cognitive.’

Rometty observes, fittingly, that “it will be the companies and the products that make the best use of data” that will be the winners in AI.  She goes on to say: “Data is the great new natural resource of our time, and cognitive systems are the only way to get value from all its volume, variety and velocity.”

She concludes with a noteworthy commentary: “Having ingested a fair amount of data myself, I offer this rule of thumb: If it’s digital today, it will be cognitive tomorrow.”

Read Full Post »

Some of the very largest Internet content companies, including Google, Facebook and Microsoft, are actively moving from routing data through traditional carriers to installing their own cables – under the ocean.  After all, the shortest distance between international networks is a straight line through the ocean, as was recently pointed out in a special “Tech – The Year Ahead” edition of Bloomberg BusinessWeek.

Ships have been laying cable across the ocean floor since 1866, when the SS Great Eastern laid the first trans-Atlantic telegraphic cable.  That cable was capable of transmitting eight words per minute. Today’s fiber-optic trans-Atlantic cable can transmit one hundred terabytes a second.
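The gap between those two figures is staggering, and a little back-of-the-envelope arithmetic makes it vivid.  The telegraph-word size below is an assumption (roughly six characters per word, counting the space):

```python
# Compare the 1866 telegraph cable with the article's modern fiber figure.
# Assumption: an average telegraph word is about 6 characters (~6 bytes).
telegraph_bps = 8 * 6 / 60      # 8 words/minute -> bytes per second
fiber_bps = 100e12              # 100 terabytes/second, per the article

ratio = fiber_bps / telegraph_bps
print(f"Telegraph: {telegraph_bps:.1f} bytes/s")
print(f"Fiber:     {fiber_bps:.0e} bytes/s")
print(f"The modern cable is roughly {ratio:.1e} times faster")
```

On those assumptions, today’s cable moves data about fourteen orders of magnitude faster than its 1866 ancestor.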

Facebook and Microsoft have joined with a Spanish broadband provider to lay a private trans-Atlantic cable, divvying up its 8 fiber-optic strands in the next year.  Other tech companies are engaged in similar efforts across both major oceans.  In effect, these companies are underwriting the next generation of capacity to meet ever-growing demand for internet content and bandwidth.  Worldwide, Bloomberg notes, 33 cable projects are scheduled to be online by 2018.

In addition to more fiber cables, there are other ways to expand bandwidth around the world, and companies are actively pursuing those too, including satellites, lasers and drones, to name just three where Facebook has active projects.  Google has experimented with hot air balloons.

So far, undersea cables have proved the most cost-effective means of providing bandwidth across oceans – they’re cheap, reliable and largely unregulated.  Under United Nations jurisdiction, cable installers are pretty much free to lay cables wherever they please, as long as they don’t interfere with existing ones.

So there you have it: Silicon Valley pouring billions into technology pioneered in the telegraph era, in an all-out effort to take hold of our destiny.  And as Microsoft Azure CTO Mark Russinovich notes, “We’re nowhere near done being built out.”

Read Full Post »

We’ve worked with PCs since the introduction of the original IBM Personal Computer, and one thing we know: the evolution of computing is never-ending.

Recently, Hewlett Packard Enterprise Co. reached a milestone in its effort to deliver a new kind of computer.  And its future business may be driven more by the components it developed in the process than by the complete system.

According to the Wall Street Journal, HPE announced the working prototype of what it calls “the Machine” in November in London.  The Machine turns long-held PC wisdom on its ear, in that the new product “leans heavily on memory technology to boost calculating speed.”  Programmers are just now starting to work on the software needed to exploit the system, which is not expected to be widely available until 2018.

HPE’s near-term plan, according to Journal reporter Don Clark, is to use components developed for the system in its conventional server systems.  One component already in production is a chip called X1, along with related components intended to sharply lower the cost of sending data over speedy fiber-optic connections.  Engineers claim this could make data transmission from one machine to another in a network as fast as data transfer within a single computer today.

The technology would also permit vertical stacking of components to save space, breaking with yet another tradition (i.e., horizontal stacking).  In traditional computing, the focal point of system design is processors, with memory considered a scarce commodity.  With the Machine, HPE claims, hundreds of thousands of processors can simultaneously tap into data stored in a vast pool of memory contained in one server or multiple boxes in a data center.  Noted CEO Meg Whitman, “It’s breaking every rule.”

The net effect of the Machine is a one hundredfold increase in computational speed – immensely useful in large chores like helping airlines respond to disruptions, preventing credit card fraud and assessing the risk of vast financial portfolios.

And if there’s one thing we’ve learned in the tech arena, most of the innovations that first solve big problems in large scale systems eventually move down to our local distributed business networks, much to the benefit of small businesses across the world.


Read Full Post »

As recently reported in the Wall Street Journal (10-27-16), the next wave of computing, expected to be widely adopted later this decade as the cloud grows more ubiquitous, will be “serverless” computing.

The word is so new, spell-checkers don’t even know about it yet.  As GE’s CTO, Chris Drumgoole commented, it’s what the “cool kids” are thinking about.

Serverless computing is the logical evolution of physical and virtual servers, where the servers used to run applications become invisible to the developers building applications.  GE’s Drumgoole expects the technology to really take off next decade.

Serverless computing “allows developers to focus only on writing code without having to manage the servers” – essentially making the process “serverless” from the customer’s perspective.

As the Journal’s Sara Castellanos describes it: “The provider runs the customer’s application on its own servers or inside “containers,” where they are broken into small pieces and placed into software shells, allowing the pieces to be distributed to any sort of device, in a digitally orchestrated manner, and at lower cost.”

Speculation is that customer IT departments might then “pay the provider every time their code is triggered, instead of doling out cash upfront for machines or virtual servers that they may not need.”  This is not unlike the way Amazon (via AWS) and Microsoft (via Azure) offer up their own cloud services to users today.

It’s the logical evolution begun a few years ago as physical servers began giving way to virtual ones.

Pioneers in the area today already include IBM, Alphabet (Google) and Amazon Web Services.  The AWS offering, called Lambda, allows software developers to write applications and upload their code to its service, where Lambda manages the code and runs it on AWS servers.  It executes that code at scale and bills for every millisecond the code is triggered, thus eliminating the need for customers to buy or maintain any physical servers.
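To make the model concrete, here is a minimal sketch of what a Lambda-style function looks like, invoked locally for illustration.  The event fields and greeting are invented for this example; in a real deployment you would upload the function to AWS, and Lambda itself would supply the event and context on each trigger:

```python
import json

def handler(event, context):
    # Lambda calls this function on demand; there is no server for the
    # developer to provision, patch, or scale.  Billing accrues only
    # while the code actually runs.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Simulate one invocation locally (AWS would pass a real context object).
result = handler({"name": "reader"}, None)
print(result["statusCode"])
```

The appeal for IT departments is visible even in this toy: there is nothing here about machines at all – just a function and the events that trigger it.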

It’s all part of the continuing shift to the cloud.  The Journal concludes by pointing out that “it is particularly useful when running applications for internet-connected devices such as Amazon’s Echo, because the apps require massive amounts of requests of short duration.”

Welcome to the future.


Read Full Post »

We noted in our prior post the recent flattening in growth of the global trade of goods and services.  At the same time, we noted, according to Information Week, McKinsey and Trends eMagazine, among others, the global exchange of data is soaring, leading to the range of competitive advantages we described in that post.  Today, we look at what these same folks are saying about the impact this will have on companies, and why it matters to yours.  (Original source content can be found here.)

Following, then, are four key developments emerging from this trend:

  1. The successful companies of the next decade will be those that embrace digital globalization. McKinsey suggests 6 actions:
  • Reevaluate the need for physical locations when a website can be just as effective
  • Consider whether creating new offerings in different markets beats the same product in each
  • Decide whether to manufacture off-shore or locate close to the customer
  • Evaluate “monetizing assets” such as customer data to develop new products/services
  • Recognize the threat for industry disruption from competitors in emerging markets
  • Protect digital assets from hackers by keeping security software up to date; train employees
  2. Countries that impose limits on data flows will damage their own economies and deprive their citizens of the benefits of digital globalization.

Information Week notes that countries as diverse as Australia, Brazil and Greece have enacted policies that limit storage of data.  Over 100 nations have enacted, or are working on, laws designed to prohibit personal data from moving across national borders – for varying reasons.  But, as the Brookings Institution found in a 2011 report, a 10% rise in broadband penetration causes a 1.3% jump in economic growth.

  3. The Internet of Things will ignite an explosion of data flows. As this next stage of the digital revolution unfolds, smarter devices, sensors, GPS and artificial intelligence “will revolutionize logistics, fleet maintenance, agriculture, healthcare and many other industries.”  According to Cisco, within 3 years, over 40% of data flows will be via machine-to-machine connections.
  4. The global trade in goods is unlikely to return to previous levels. American companies are re-shoring manufacturing operations back to the U.S.  As 3-D printing advances, companies “will create more products and intermediate goods as they are needed and where they will be used, rather than ordering from overseas suppliers.”

That’s the future of business and data globalization as defined by several of the world’s best prognosticators and subject matter experts.  While some of their advice is aimed at larger companies, industries, even countries… the lessons should not be lost on today’s small businesses.


Read Full Post »

As we come to the end of September 2016, there’s a lot of talk in this political silly season about the pros and cons of globalization, some of it regrettably uninformed, even as the underlying frustrations expressed by proponents of limiting globalization are both understandable and palpable.  So in our last two posts of this month, we take a quick look at why today’s businesses must embrace the digital future.

The growth of globalization, that is, the growth in the exchange of goods, services, people, money and data across national borders, has slowed down in recent years (this, according to the June, 2016 issue of Trends eMagazine).

This is not surprising, given that the world has a finite number of people, limiting the number who can move from one country to another.  And after all, we only need so many TVs, phones, t-shirts and pineapples.  There’s a limit to the number of goods that can be produced in one country and sold in another, note Trends’ editors.

But there is one type of exchange that is not finite: data.  According to Information Week, the amount of “online data exchanged across borders increased by a factor of 18 between 2005 and 2012.”  By 2025, it’s expected to increase another 700%.  And according to McKinsey, all this data flow had a combined impact of over $2 trillion in value in 2014 – more than “global trade in products or foreign direct investment.”  And that cross-border data flow barely existed in 2000.
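An 18-fold increase in seven years implies a remarkable compound growth rate, which a couple of lines of arithmetic make plain:

```python
# Implied compound annual growth rate (CAGR) from the Information Week
# figure quoted above: cross-border data flows grew 18x from 2005 to 2012.
growth_factor = 18
years = 2012 - 2005

cagr = growth_factor ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.0%}")
```

That works out to roughly 51% growth per year, every year, for seven straight years – a pace almost no physical trade flow has ever sustained.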

Driving this trend, of course, is the enormous increase in bandwidth across borders – an increase of nearly 50 times in the past decade, and predicted by McKinsey to increase by another 900% within 5 years.  The benefits of these global data flows include:

  • Lower transaction costs: a customer in Europe downloads an e-book from Amazon, eliminating paper, shipping and wait times.
  • Global scope: The smallest company can now be an actor on the world stage, quickly and cheaply. There are 50 million companies with Facebook pages.
  • Empowerment of individuals: Free online courses… job postings… connections worldwide, etc.

In fact, McKinsey finds, global trade in goods, long the traditional creator of world wealth, has actually flattened in recent years.  That’s one reason companies are re-shoring their manufacturing — as higher labor costs in China, the need to respond more quickly to shifts in demand, and the opportunity to capture savings from shorter supply chains have all served to reduce cross-country trade lately.

Trends editors see all this as leading to four emerging developments that will be crucial to companies in the not so distant future.  We’ll take a look at all four in our next and concluding post on the revolution in global data exchange.  Stay tuned…

Read Full Post »
