
Posts Tagged ‘cloud’

According to The Wall Street Journal (1/17/18), Google is expanding its already large network of undersea cables to access new regions around the world not currently well served by its competitors, as well as to give itself some rerouting capabilities if a region fails or gets overloaded.  It’s all part of the now unstoppable growth of cloud computing services, as well as a play to keep up with its two greatest competitors, Amazon and Microsoft.

Google VP Ben Treynor professes that he “would prefer not to have to be in the cable-building consortium business” but found there weren’t a lot of other options.

Google parent Alphabet already owns a massive network of fiber optic cables and data centers, handling about 25% of the world’s internet traffic.  These facilities allow Google to control its data-intensive software without having to rely on the big telecommunications providers.

After a decade of construction, Google will soon have 11 underwater cables around the world.  They’re used to “refresh research results, move video files and serve cloud computing customers” around the globe.

And at that, Google currently ranks third in cloud-computing revenue behind Amazon and Microsoft in the biggest tech race on the planet right now.  Billions of dollars of revenue annually are at stake, the Journal points out, as companies increasingly move their data operations to the cloud.

Currently, its longest cable stretches 6,200 miles from Los Angeles to Chile.  But Google has also teamed up with others, like Facebook, for its latest build-out.  They plan to share capacity on a 4,500 mile cable from the east coast of the U.S. to Denmark, with an additional terminal in Ireland, thus increasing its bandwidth across the Atlantic.

Another cable of 2,400 miles will run from Hong Kong to Guam, hooking up with cable systems from Australia, East Asia and North America.

The internet build-out continues, as does the march to cloud dominance.  Only today, you’ve got to have a few billion dollars in your pocket to play.

 


Cloud storage services have become big business.  And a handful of familiar names are quickly grabbing up ownership of most of it.

For starters, Amazon is by far the largest cloud provider today, garnering (according to Amazon Web Services CEO Andy Jassy) “several times as much business as the next 14 providers combined.”

Microsoft is next largest in terms of sales of the infrastructure services that store data and run applications.  But at last read, they were still less than one-fifth Amazon’s size.

And Google places third, even though it is now by market value the second-largest company in the world, with only about one-fifteenth of Amazon’s cloud business revenues.

Still, Microsoft and Google aren’t standing still.  Microsoft’s cloud unit, called Azure, has won over some large customers lately, including Bank of America and Chevron.  They are said (according to a recent article in Businessweek) to have done it by focusing on salesmanship and relationship-building skills, something not necessarily the forte of the Amazon business model.  Microsoft CEO Satya Nadella has turned his sales force into “a roving R&D lab and management consultancy.”  They’re hooking up smaller startups with potential investors and giving larger prospects access to a sales team that helps them market their Azure-based apps to their own customers.  Win-win.

Microsoft is also increasingly moving its traditional Office suite to the cloud via initiatives like Office 365 and the new Dynamics 365 products and branding.  This makes it more likely that when companies think about moving off their own data centers they’ll view Microsoft favorably, exploiting that existing relationship when it comes to migrating to a public cloud.

To step up its game, Google recently hired the co-founder of VMware, Diane Greene, to run its cloud business, starting with a cloud sales force it is building from scratch.  Google also recently announced a partnership with Salesforce.com to take advantage of its list of preferred cloud providers, according to Businessweek.

One big advantage both Google and Microsoft will try to exploit over Amazon is the fact that Amazon often competes fiercely with many of its own prospective cloud clients.  Wal-Mart and others are not keen on seeing their AWS payments benefit the very retailer they most compete against.

It’s still too early to say who will end up on top, but the battle is fierce, and you can expect all three of these tech titans to be in that mix for years to come.  It’s already a $35 billion market that’s projected to grow to about $90 billion within four years according to Gartner analysts.

As AWS’s Jassy notes, “This is the biggest technology shift of our lifetimes.”

 


The cloud is just another computer somewhere else.

That clever little phrase embodies what the cloud is about as well as anything you’ll read.

But, it turns out, assessing just how well and how fast cloud computing is growing is a bit more problematic.  This comes to light in a February 2, 2016 article by Angus Loten and Kimberly Johnson of the Wall Street Journal.

It seems these days that some very large cloud providers, including Microsoft, IBM, Oracle and others have been a little fuzzy about their numbers and what they classify as “cloud” services.

Gartner researchers expect 17% growth in cloud services to $204 billion this year.  But some providers use the cloud term in marketing and sales pitches, as well as earnings reports, to include a very wide range of services, according to senior tech executives, industry analysts and the Journal’s reporting.

As the CIO of General Electric noted, “Vendors are constantly showing up and trying to sell you everything as cloud.”  It takes “extra work” to determine what are and aren’t true cloud services that will be around for the long haul.  That’s a growing concern as software deals stretch into longer and longer terms of years.

Under GAAP, these tech companies have wide latitude to define cloud revenue, but it’s hard to see how providers stack up when their earnings reports may reflect not just software sales, but also servers, maintenance, other tools, infrastructure and consulting fees.  It makes comparisons difficult, and the suspicion becomes that companies are inflating their “cloud” revenue reporting in an effort to demonstrate (read: impress the markets with) stronger cloud sales growth.

As a result, the perception of such strong sales growth can lift a company’s stock, since organizations today are being “rewarded with higher multiples [i.e., stock price values] for cloud revenue,” notes the Journal.  It’s becoming “highly complex” to determine exactly how firms are booking cloud revenues.

The Journal article points out the need for greater clarity and detail from these large vendors when reporting cloud revenue.  Since companies have a lot of leeway in their reporting, standards are non-existent, and it’s becoming harder to separate the marketing fluff from the actual cloud performances of these firms.

Look for further confusion for a while, as companies figure out how to provide more clarity around just what, exactly, comprises the cloud.

 

 


Our title today actually applies to one of the problems with using the cloud (sorry, Eagles fans).  Most major cloud providers (Salesforce, Microsoft, Google, Amazon) use different cloud technologies, which makes switching from one to another a bit of a nightmare for users, who must rewrite much of their software to fit a competing cloud.  That’s a problem, one dubbed the Hotel California problem by Jeremy King, CTO of Wal-Mart Stores’ e-commerce division, as quoted in the Sept. 6th issue of Bloomberg Businessweek.

Bloomberg predicts that within the next five years about 33% of companies using the cloud will switch providers to get lower prices or add more features, or to add another provider to get servers closer to customers, or for backup processes.

For these reasons, a big new story in the cloud provider world has been the rise of “container software” from companies like Docker, among several others.  Container software serves the purpose of chopping up and isolating applications to make transitions from one cloud to another easier.  About a dozen container companies are currently vying for space in this market.  The goal is to help keep businesses from getting stuck with a single cloud provider whose service may not be in their long-term best interests.

Container software breaks up cloud code into separate pieces “each bundled with all the basic system software [they] need to operate independently of whichever plays host.”  That in turn has the potential to eliminate an awful lot of code required for each new cloud operating system and platform.  This is especially useful for apps that evolve into global hits with millions of users.

“Moving containers from one cloud provider to another is as simple as downloading them onto new servers,” notes Jim Zemlin, executive director of the Linux Foundation.

Of course, a dozen market choices for selecting the right container software also has the potential to gum up the works, as companies seek to find the right horse to ride.  “You could pick the wrong horse, just like VHS vs. Betamax,” notes David Linthicum, a cloud app creator and consultant.

To address this concern, industry groups have begun working on common standards.  Microsoft and Amazon are leading the Open Container Initiative, founded in June.  (Microsoft publishes Microsoft Dynamics NAV, a core software offering of our firm.)  Meanwhile, the Cloud Native Computing Foundation, formed the following month and including Google, IBM, Intel and others, focuses on “orchestration software.”  Each foundation is operating with a certain sense of urgency.

Amazon Web Services vice president Ariel Kelman says he’s not worried about a customer base that’s more capable of shopping around.  “Our customer retention strategy is to delight our customers, not to lock them in,” he notes.  Meanwhile, Linthicum thinks it will take two years for the groups to deliver their final standards, as the providers strive to make switching between containers easier.

 

Read Full Post »

The website technologyevaluation.com recently wrote about why, despite all the chatter these days (good and bad) about The Cloud, manufacturers are not rushing into SaaS (Software as a Service) based enterprise resource planning (ERP) products.  Their points are worth noting, which we’ll do today in brief, adding our own commentary where appropriate.

  1. Multiple modification requirements. Manufacturing is complex, often including high volumes of unique transactions.  ERP systems for manufacturers almost always must include consideration for unique customizations.  While some cloud ERP products can accommodate low-level changes, the SaaS model in general is the antithesis of a customizable solution.  While software publishers may hope you buy in – so they can host and serve the same software once across many paying customers – the model is not such a good fit for most manufacturers.  A system with ‘standard’ functionality will usually not meet the demanding requirements of today’s manufacturer.
  2. Data ownership and overall increased dependency on a third party. From temporary outages to “the theoretical possibility that a provider might file for bankruptcy” (in the words of technologyevaluation.com), along with the simple logistical concerns about the availability of one’s data, cloud solutions pose a distinct risk to manufacturers.  You’re putting your company data in the hands of an entity you do not control.  On the other hand, you may already be sharing lots of your company data with your bank, creditors, suppliers or others, and thus already sharing to some degree.
  3. Strict compliance requirements. Cloud ERP providers may or may not be able to accommodate any unique compliance regulation relevant to your particular niche.  Your requirements may simply not match up well with a provider’s infrastructure, deployment method or, for example, a requirement to use separate servers for separate functionalities.
  4. Security concerns. Cloud providers will increasingly be heard boasting about security that’s even higher than on-premise solutions, and increasingly those boasts will ring true.  However, it’s always been the case with security that the weakest link is not in the data center – whether that be on-premise or off-premise – but rather, with the user.  On-premise systems can in some cases provide high levels of field- or record-level data security; some cloud providers may offer this with their Service Level Agreements, but not all do, and it can be expensive.
  5. Leasing vs. Purchasing. Although leasing a la cloud offerings seems cost-effective in the short run (and it is), it’s often just the reverse in the long run, that is, when taking a total cost of ownership (TCO) view.  Think of it this way: Are you the kind of auto buyer who likes a steady ongoing lease payment that never ends, or are you more inclined to buy on payment terms, content in the knowledge after a few years that ‘now I own it’ and the payments have ended?
  6. Integration with other corporate applications. ERP systems frequently need to integrate, usually in some data reach-out or back-door manner, with other internal systems.  Your ERP system will likely need to integrate with other cloud and on-premise systems.  This is almost always true in manufacturing firms.  The total cost of doing so will often be higher in the cloud deployment model than might be the case in a strictly on-premise environment.
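To put point 5 in perspective, here’s a back-of-the-envelope TCO comparison in Python.  Every figure below is hypothetical, chosen only to illustrate how a cheap-looking subscription can overtake a one-time purchase:

```python
# Back-of-the-envelope TCO comparison: SaaS subscription vs. perpetual
# license.  All figures are hypothetical, for illustration only.

def cumulative_cost_saas(years, monthly_fee_per_user, users):
    """Total spent on a subscription after `years` years."""
    return years * 12 * monthly_fee_per_user * users

def cumulative_cost_onprem(years, license_cost, annual_maintenance_rate):
    """Up-front license plus annual maintenance after `years` years."""
    return license_cost * (1 + annual_maintenance_rate * years)

USERS = 25
MONTHLY_FEE = 100      # per user, per month
LICENSE = 60_000       # one-time purchase
MAINT = 0.18           # maintenance at 18% of license per year

for year in range(1, 8):
    saas = cumulative_cost_saas(year, MONTHLY_FEE, USERS)
    onprem = cumulative_cost_onprem(year, LICENSE, MAINT)
    print(f"Year {year}: SaaS ${saas:,} vs. on-premise ${onprem:,.0f}")
```

With these made-up numbers the subscription looks far cheaper at first but passes the on-premise total cost around year four, which is exactly the long-run effect the leasing analogy is getting at.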

While some or all of these considerations can be mitigated by cloud solutions, the real question is the cost and complexity of doing so.  In all cases, special due diligence is highly recommended.

As the old saying goes: You know who the pioneers are, right?  They’re the ones up front with the arrows in their backs.  Proceed accordingly.

 


Let’s start with a basic premise: as Christopher Mims of the Wall Street Journal put it in a May 19th article, “Getting data into and out of the cloud is harder than most engineers often are willing to admit.”

The problem is bandwidth.

Granted, if you want to save the cost and trouble of storing data, the cloud works well when all you want to do is transfer data via high-speed wiring.  Although frankly, even that can be problematic for the many businesses that still have less-than-perfect Internet bandwidth.

But in a world where mass connectivity requires true bandwidth that includes a wide array of mobile devices, users and providers alike struggle with the limitations of wireless networks.  According to the World Economic Forum, Mims points out, the U.S. ranks only 35th in bandwidth per user.

This has helped make mobile apps a predominant way to do things on the Internet, especially on smartphones.  Some of the data and processing is actually handled on your device – not in the “pipes” (or in this case, airwaves) that lead to it.

And the issue of “getting things done” is becoming more problematic as the Internet becomes the Internet of Things, where “smart” devices can sense their environment and even receive commands remotely.  We’ll see this before long (already can, actually) in everything from drones to your refrigerator.

The problem is, modern 3G and 4G networks are not up to the task: they’re simply not fast enough to transmit data from devices to the cloud at the pace it’s generated.  As an example, Mims cites Boeing, where “nearly every part of the plane is connected to the Internet,” often sending continuous streams of status data to the cloud.  One jet engine alone is said to generate half a terabyte of data per flight.
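Some quick arithmetic shows the scale of the mismatch.  The sketch below computes how long it would take to push half a terabyte – one flight’s worth of engine data, per the article – through various links; the link speeds are rough, illustrative figures, not measured ones:

```python
# Time to move half a terabyte over links of various (rough, illustrative)
# speeds.  Shows why mobile networks can't keep pace with device data.

DATA_BYTES = 0.5 * 10**12  # 0.5 TB

def hours_to_transfer(link_mbps):
    """Hours needed to move DATA_BYTES over a link of `link_mbps` megabits/sec."""
    bits = DATA_BYTES * 8
    seconds = bits / (link_mbps * 10**6)
    return seconds / 3600

for name, mbps in [("3G (~2 Mbps)", 2), ("4G (~20 Mbps)", 20), ("Gigabit fiber", 1000)]:
    print(f"{name}: {hours_to_transfer(mbps):,.1f} hours")
```

Even at a generous 20 Mbps, half a terabyte takes more than two days to upload – far longer than the flight that produced it.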

Luckily, there’s a solution: Stop focusing on the cloud and start storing and processing these torrents of data on the devices themselves, or on devices that sit between our things and the Internet.  Someone at Cisco Systems has coined a name for this: “fog computing.”

Whereas the cloud is up in the sky, the fog is close to the ground, that is, where things are getting done.  It consists of weaker and dispersed little computers that are making their way into cars, appliances, factories and most any other new-tech object you can conjure.  In fact, Cisco plans to turn its routers into “hubs for gathering data and making decisions about what to do with it.”  These smart routers will never talk to the cloud unless they have to, say in a rail emergency or other critical application.

While the cloud consists of physical servers in nearly uncountable numbers today, the fog consists of all the computers that are already all around us, tied together.  So our mobile devices might send updates to each other, instead of routing them through the cloud.  In fact, the fog could eventually compete with the cloud for some functions.
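As a concrete (and entirely hypothetical) sketch of that pattern: a fog node sitting on or near the device summarizes a torrent of raw readings locally and forwards only a compact summary, with the “cloud” represented here by a plain function return.  All names and thresholds below are made up for illustration:

```python
# Illustrative fog-computing sketch: summarize raw sensor readings locally,
# send only the summary (and any anomalies) upstream to the cloud.
# All names and thresholds are hypothetical.

from statistics import mean

def fog_node(readings, alert_threshold):
    """Reduce raw readings to a small summary; flag anomalies for the cloud."""
    return {
        "count": len(readings),                              # how many raw samples
        "mean": round(mean(readings), 2),                    # local aggregate
        "max": max(readings),
        "alerts": [r for r in readings if r > alert_threshold],
    }

# 1,000 simulated temperature readings; only the summary travels upstream.
raw = [400 + (i % 7) for i in range(1000)]
raw[500] = 900  # a single anomalous spike
upstream = fog_node(raw, alert_threshold=600)
print(upstream["count"], upstream["max"], upstream["alerts"])
```

A thousand readings collapse into a handful of numbers, which is the whole bandwidth argument for the fog in miniature.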

If all this is putting you into a fog, fear not.  While much of computing’s future lies in the cloud, Mims thinks the truly transformative computing will take place right here, in the objects surrounding us – in the fog.


In our prior two posts (here and here) we described the fundamentals of cloud computing, and looked at today’s three modes of deployment (public, private, hybrid) and three modes of service (SaaS, PaaS, IaaS).  Right after we wrote those two posts, we came across an article in the Jan. 10th issue of the Wall Street Journal that we think puts all this into pretty good perspective.  We’ll recap highlights from that article here today…

According to the Journal report, in 2011 cloud spending in the U.S. accounted for around 7% of the $53 billion in total IT (Information Technology) expenditures.  That’s a pretty small, though growing, share.  Some businesses can save substantially by outsourcing various aspects of their infrastructure to cloud providers.

But business owners and managers also need to be aware of the downsides, which to most small business owners include concerns over security and lack of control over their own data. There is still a measure of confidence to be gained by the knowledge that an owner can walk down the hall to his own servers, on his own premises, under his own power, and know that the software and the data that he runs his business on is a very tangible, controllable asset.  And while cloud data centers can rightfully boast of “five nines” (99.999%) data backup and redundancy, there’s nothing like being able to see it and touch it. 
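For a sense of scale on those “nines”: each additional nine cuts the permitted annual downtime by a factor of ten.  A quick illustrative calculation:

```python
# How much downtime per year each level of availability "nines" allows.
# "Five nines" (99.999%) sounds close to perfect; the numbers show why.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def annual_downtime_minutes(nines):
    """Permitted downtime per year at an availability of 1 - 10**-nines."""
    return MINUTES_PER_YEAR * 10 ** -nines

for nines in range(2, 6):
    pct = 100 * (1 - 10 ** -nines)
    print(f"{pct:.3f}% available -> {annual_downtime_minutes(nines):,.2f} minutes down/year")
```

Five nines works out to barely five minutes of downtime a year, while the “three nines” of a typical in-house setup allows nearly nine hours.  Seeing and touching your own servers is reassuring, but it rarely buys you that.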

That sense is if anything heightened the first time you’re running a cloud app and “the Internet goes down.”  (We know the feeling.)  Whether it’s a router problem in house, or a provider issue at a link in Chicago, or a power issue… it matters little when it’s your company that’s down.  And it does happen.  Witness Amazon’s scare last summer, when electrical storms cut power to ten data centers around Washington, D.C., shutting down generators and leaving thousands ‘off the grid’.

This leads one to the same conclusion as the Journal points out: “Operating servers both on-site and in the cloud is a very effective way of reducing risk… Business owners should make the decision based on the support they have available.”

In other words, while you can put things like file backup and email safely into the cloud – applications where a few hours or even a day of down time won’t kill you – companies are still best advised to keep the mission critical stuff (financials, order processing, etc.) on local resources where you stay in control. 

Quoting the Journal again, “Financial benefits and convenience aside, some entrepreneurs are still wary of the security risks.  The top drawback for small businesses adopting cloud services in 2012 was data security, according to IDC research.”

The prudent owner will step into the cloud gently, carefully.  That’s the essence of prudence, after all.  Start with something like off-site backup, or moving your email to the cloud, especially if you currently run Microsoft’s Exchange Server in-house.  The reduced internal IT expense, in both hardware and staff time, can easily make the move cost-justifiable.  We’ve found that the uptime, the remote (out of office) capabilities and the overall responsiveness are every bit as good as when we ran our email ourselves.  Except now we’ve freed up valuable IT resources for more important tasks – like client application development.

But when it comes to core operational tasks like your ERP or financial system, we advise heeding the advice of my old Slovenian grandfather (and apparent cloud expert), who said it best half a century ago: “Take it slow, keep it go.”

 

