
Posts Tagged ‘cloud’

Today we’re sharing a few behind-the-scenes details about the latest incarnation of Microsoft Dynamics NAV as it continues its evolution into the cloud-first product now called Dynamics 365 Business Central.

Our comments come from various NAV blogs, with a special tip of the hat to long-time Italian NAV developer Roberto Stefanetti.

To begin with, as a cloud-first product, expect to see even more frequent updates to Business Central than we saw under NAV, which used to be updated about every 18 months and, more recently, annually.  A lot more documentation is becoming available to resellers, including courses and videos, with more expected.  A BC-dedicated forum has recently come online.

Because the BC product is cloud-first by design, Microsoft will always update that product first, with upgrades to server-based (i.e., on-premises) systems coming afterwards.  Using a cloud version of an ERP product means, in effect, that you are always up to date.

The new development environment, “born first to Business Central” as Stefanetti puts it, lets us create ‘extensions’ which can then be ‘certified’ (or not), enabling customization of the core product in what some call a “less invasive” fashion (i.e., less labor-intensive for later upgrades) than in the past.

Business Central will now be able to offer native (i.e., cloud-integrated) services for Outlook, Office 365, Microsoft CRM, Power BI and Flow, to name a few.  Users will now be able to grow into a ‘virtual desk’ in the cloud – and with their ERP system there as well, they’ll never have to leave the cloud.

For its ‘public’ cloud, of course, Microsoft will feature Azure, which has now become the second most popular public cloud platform in the world (behind Amazon Web Services).  Azure offers a hosted public-cloud experience that features a multi-tenant database (many customers running off the same database), an event-based architecture, extensions for customization development, and an App store as a channel for distributing apps to others.

From a cost standpoint, you’ll be paying by the user/month, so you’re only paying for the ‘amount’ of software you truly need, and there will be several types of users, at different price points.  Think of it very much like a car lease – you pay a fixed rate every month based on user counts and types, and your system is always kept up to date.  One price, generally speaking, can be made to cover the application, all hosting for users, maintenance/upgrades, and probably even additional services (like Office, etc.).  That’s a conversation you have with your Business Central business partner (i.e., reseller).  It’s a strong move into today’s new billing model: subscription pricing, and we’re seeing it everywhere, not just in software.
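
To make the per-user math concrete, here’s a quick sketch (in Python, for the technically inclined).  The user types and price points below are hypothetical placeholders, not actual Business Central pricing – that’s a conversation for your reseller:

```python
# Hypothetical user types and per-user/month prices -- NOT actual
# Business Central list pricing.
PRICES = {"full_user": 100, "team_member": 8}    # USD per user per month

def monthly_cost(user_counts):
    """One month's subscription cost, given a head count per user type."""
    return sum(PRICES[utype] * n for utype, n in user_counts.items())

# e.g., 10 full users plus 25 team members:
print(monthly_cost({"full_user": 10, "team_member": 25}))    # -> 1200
```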

Given the benefits noted above, and since we haven’t had a chance to test all the limitations in the new model, we’ll quote verbatim what consultant Stefanetti has to say on the subject.  It’s important to note that his comments on limitations are specific to the Microsoft Azure public-cloud hosting option.  Partners (like us) are able to offer OTHER non-Azure options that avoid some of the stated limitations.  Nonetheless, of the Azure/public approach, Stefanetti notes as follows:

The system is closed (but secure). You can’t access SQL Server and databases. Only the environment-specific tenant that you have purchased exists. You can’t create development environments, only sandboxes in the same tenant for the purpose of testing the data. Therefore, the modality of the approach is very different from the on-premises world.

It is not possible to back up the database because we do not access SQL Server. The system does not go down, but it is possible to restore data if necessary. The backup is managed by Microsoft, with no way to schedule an auto-backup. Therefore, a backup cannot be launched by the end user, but if necessary, it is possible to open an issue with Microsoft and they can provide a restore.

You can use RapidStart Services packages to export data, but that isn’t a real backup system (you can’t restore your database after a crash) like an on-premises system has. Rather, this tool allows you to export, for example, the “setup data” for master tables and secondary tables (a copy of the setup).

Sorry, that’s all the space (and then some) we have today, but we’ll continue to cover more Business Central update details in the future, just as we’ve been doing.  Stay tuned.


According to The Wall Street Journal (1/17/18), Google is expanding its already large network of undersea cables to access new regions around the world not currently well served by its competitors, as well as to give itself some rerouting capabilities if a region fails or gets overloaded.  It’s all part of the now unstoppable growth of cloud computing services, as well as a play to keep up with its two greatest competitors, Amazon and Microsoft.

Google VP Ben Treynor professes that he “would prefer not to have to be in the cable-building consortium business” but found there weren’t a lot of other options.

Google parent Alphabet already owns a massive network of fiber-optic cables and data centers, which already handle about 25% of the world’s internet traffic.  These facilities allow Google to run its data-intensive software without having to rely on the big telecommunications providers.

After a decade of construction, Google will soon have 11 underwater cables around the world.  They’re used to “refresh research results, move video files and serve cloud computing customers” around the globe.

Even at that, Google currently ranks only third in cloud-computing revenue, behind Amazon and Microsoft, in the biggest tech race going on the planet right now.  Billions of dollars of revenue annually are at stake, the Journal points out, as companies increasingly move various data operations to the cloud.

Currently, its longest cable stretches 6,200 miles from Los Angeles to Chile.  But Google has also teamed up with others, like Facebook, for its latest build-out.  They plan to share capacity on a 4,500-mile cable from the east coast of the U.S. to Denmark, with an additional terminal in Ireland, thus increasing its bandwidth across the Atlantic.

Another 2,400-mile cable will run from Hong Kong to Guam, hooking up with cable systems from Australia, East Asia and North America.

The internet build-out continues, as does the march to cloud dominance.  Only today, you’ve got to have a few billion dollars in your pockets to play.


Cloud storage services have become big business.  And a handful of familiar names are quickly grabbing up ownership of most of it.

For starters, Amazon is by far the largest cloud provider today, garnering (according to Amazon Web Services CEO Andy Jassy) “several times as much business as the next 14 providers combined.”

Microsoft is next largest in terms of sales of the infrastructure services that store data and run applications.  But at last read, they were still less than one-fifth Amazon’s size.

And Google places third: by market value it is now the second-largest company in the world, yet it holds only about one-fifteenth of Amazon’s cloud business revenues.

Still, Microsoft and Google aren’t standing still.  Microsoft’s cloud unit, called Azure, has won over some large customers lately including Bank of America and Chevron.  They are said (according to a recent article in Businessweek) to have done it by focusing on salesmanship and relationship building skills, something not necessarily the forte of the Amazon business model.  Microsoft CEO Satya Nadella has pushed his sales force into “a roving R&D lab and management consultancy.”  They’re hooking up smaller startups with potential investors and giving larger prospects access to a sales team that helps them market their Azure-based apps to their own customers.  Win-win.

Microsoft is also increasingly moving its traditional Office suite to the cloud via initiatives like Office 365 and the new Dynamics 365 products and branding.  This makes it more likely that when companies consider moving off their own data centers they’ll consider Microsoft favorably, exploiting that existing relationship when it comes to migrating to a public cloud.

To step up its game, Google recently hired the co-founder of VMware, Diane Greene, to run its cloud business, starting with a cloud sales force it is building from scratch.  Google also recently announced a partnership with Salesforce.com to take advantage of its list of preferred cloud providers, according to Businessweek.

One big advantage both Google and Microsoft will try to exploit over Amazon is the fact that Amazon often competes fiercely with many of its own prospective cloud clients.  Wal-Mart and others are not keen on seeing their AWS payments benefit the very retailer they most compete against.

It’s still too early to say who will end up on top, but the battle is fierce, and you can expect all three of these tech titans to be in that mix for years to come.  It’s already a $35 billion market that’s projected to grow to about $90 billion within four years according to Gartner analysts.

As AWS’s Jassy notes, “This is the biggest technology shift of our lifetimes.”


The cloud is just another computer somewhere else.

That clever little phrase pretty well embodies What Cloud Is as well as anything you’ll probably read.

But, it turns out, assessing just how well and how fast cloud computing is growing is a bit more problematical.  This comes to light in a February 2, 2016 article by Angus Loten and Kimberly Johnson of the Wall Street Journal.

It seems these days that some very large cloud providers, including Microsoft, IBM, Oracle and others have been a little fuzzy about their numbers and what they classify as “cloud” services.

Gartner researchers expect 17% growth in cloud services to $204 billion this year.  But some providers use the cloud term in marketing and sales pitches, as well as earnings reports, to include a very wide range of services, according to senior tech executives, industry analysts and the Journal’s reporting.

As the CIO of General Electric noted, “Vendors are constantly showing up and trying to sell you everything as cloud.”  It takes “extra work” to determine what are and aren’t true cloud services that will be around for the long haul.  That’s a growing concern as software deals stretch into longer and longer terms of years.

Under GAAP, these tech companies have wide latitude to define cloud revenue, but it’s hard to see how providers stack up when their earnings reports may reflect not just software sales, but also servers, maintenance, other tools, infrastructure and consulting fees.  It makes comparisons difficult, and the suspicion becomes that companies are inflating their “cloud” revenue reporting in an effort to demonstrate (read: impress the markets with) stronger cloud sales growth.

As a result, the perception of such strong sales growth can boost a company’s stock, since organizations today are being “rewarded with higher multiples [i.e., stock price values] for cloud revenue,” notes the Journal.  It’s becoming “highly complex” to determine exactly how firms are booking cloud revenues.

The Journal article points out the need for greater clarity and detail from these large vendors when reporting cloud revenue.  Since companies have a lot of leeway in their reporting, standards are non-existent, and it’s becoming harder to separate the marketing fluff from the actual cloud performances of these firms.

Look for further confusion for a while yet, as companies figure out how to provide more clarity around just what, exactly, comprises cloud.


Our title today actually applies to one of the problems with using the cloud (sorry, Eagles fans).  Most major cloud providers today (Salesforce, Microsoft, Google, Amazon) use different cloud technologies that make switching from one to another a bit of a nightmare for users, who are required to rewrite much of their software to fit a competing cloud.  That’s a problem, one dubbed the Hotel California problem by Jeremy King, CTO of Wal-Mart Stores’ e-commerce division, and quoted in the Sept. 6th issue of Bloomberg Businessweek.

Bloomberg predicts that within the next five years about 33% of companies using the cloud will switch providers to get lower prices or add more features, or to add another provider to get servers closer to customers, or for backup processes.

For these reasons, a big new story in the cloud provider world has been the rise of “container software” from companies like Docker, among several others.  Container software serves the purpose of chopping up and isolating applications to make transitions from one cloud to another easier.  About a dozen container companies are currently vying for space in this market.  The goal is to help keep businesses from getting stuck with a single cloud provider whose service may not be in their long-term best interests.

Container software breaks up cloud code into separate pieces “each bundled with all the basic system software [they] need to operate independently of whichever plays host.”  That in turn has the potential to eliminate an awful lot of code required for each new cloud operating system and platform.  This is especially useful for apps that evolve into global hits with millions of users.

“Moving containers from one cloud provider to another is as simple as downloading them onto new servers,” notes Jim Zemlin, executive director of the Linux Foundation.
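
To illustrate Zemlin’s point with a hedged little sketch: assuming Docker is installed on both providers’ servers, and assuming a local image we’ll call “myapp:1.0” (a made-up name), the move really is just an export, a copy, and an import:

```python
import subprocess

# A minimal sketch, assuming Docker is installed on both hosts and that a
# local image named "myapp:1.0" (hypothetical) already exists.

# On the old provider's server: export the image to a portable archive.
subprocess.run(["docker", "save", "-o", "myapp.tar", "myapp:1.0"], check=True)

# ...copy myapp.tar to the new provider's server, then load and run it there:
subprocess.run(["docker", "load", "-i", "myapp.tar"], check=True)
subprocess.run(["docker", "run", "--rm", "myapp:1.0"], check=True)
```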

Of course, a dozen market choices for selecting the right container software also has the potential to gum up the works, as companies seek to find the right horse to ride.  “You could pick the wrong horse, just like VHS vs. Betamax,” notes David Linthicum, a cloud app creator and consultant.

To address this concern, industry groups have begun working on common standards.  Microsoft and Amazon are leading the Open Container Initiative, founded in June.  (Microsoft publishes Microsoft Dynamics NAV, a core software offering of our firm.)  Meanwhile, the Cloud Native Computing Foundation, formed the following month, includes Google, IBM, Intel and others, and focuses on “orchestration software.”  Each foundation is operating with a certain sense of urgency.

Amazon Web Services vice president Ariel Kelman says he’s not worried about a customer base that’s more capable of shopping around.  “Our customer retention strategy is to delight our customers, not to lock them in,” he notes.  Meanwhile, Linthicum thinks it will take two years for the groups to deliver their final standards, as the providers strive to make switching between containers easier.


The website technologyevaluation.com recently wrote about why, despite all the chatter these days (good and bad) about The Cloud, manufacturers are not rushing into SaaS (Software as a Service) based enterprise resource planning (ERP) products.  Their points are worth noting, which we’ll do today in brief, adding our own commentary where appropriate.

  1. Multiple modification requirements. Manufacturing is complex, often including high volumes of unique transactions.  ERP systems for manufacturers almost always must include consideration for unique customizations.  While some cloud ERP products can accommodate low-level changes, the SaaS model in general is the antithesis of a customizable solution.  While software publishers may hope you buy in – so they can host and serve the same software once across many paying customers – the model is not such a good fit for most manufacturers.  A system with ‘standard’ functionality will usually not meet the demanding requirements of today’s manufacturer.
  2. Data ownership and overall increased dependency on a third party. From temporary outages to “the theoretical possibility that a provider might file for bankruptcy” (in the words of technologyevaluation.com), along with the simple logistical concerns about the availability of one’s data, cloud solutions pose a distinct risk to manufacturers.  You’re putting your company data in the hands of an entity you do not control.  On the other hand, you may already be sharing lots of your company data with your bank, creditors, suppliers or others, and thus are already sharing to some degree.
  3. Strict compliance requirements. Cloud ERP providers may or may not be able to accommodate any unique compliance regulation relevant to your particular niche.  Your requirements may simply not match up well with a provider’s infrastructure, deployment method or, for example, a requirement to use separate servers for separate functionalities.
  4. Security concerns. Cloud providers will increasingly be heard boasting about security that’s even higher than on-premise solutions, and increasingly those boasts will ring true.  However, it’s always been the case with security that the weakest link is not in the data center – whether on-premise or off-premise – but rather with the user.  On-premise systems can in some cases provide high levels of field- or record-level data security; some cloud providers may offer this in their Service Level Agreements, but not all do, and it can be expensive.
  5. Leasing vs. Purchasing. Although leasing à la cloud offerings seems cost-effective in the short run (and it is), it’s often just the reverse in the long run, that is, when taking a total cost of ownership (TCO) view – see the sketch just after this list.  Think of it this way: Are you the kind of auto buyer who likes a steady ongoing lease payment that never ends, or are you more inclined to buy on payment terms, content in the knowledge after a few years that ‘now I own it’ and the payments have ended?
  6. Integration with other corporate applications. ERP systems frequently need to integrate, usually in some data reach-out or back-door manner, with other internal systems.  Your ERP system will likely need to integrate with other cloud and on-premise systems.  This is almost always true in manufacturing firms.  The total cost of doing so will often be higher in the cloud deployment model than might be the case in a strictly on-premise environment.
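
To put point 5 in numbers, here’s a minimal total-cost-of-ownership sketch.  Every figure in it is a hypothetical placeholder, not a real quote – your own numbers will differ:

```python
# Hypothetical placeholder figures -- not real quotes.
SUBSCRIPTION_PER_MONTH = 2_000      # hosted/SaaS, all-in, USD
PURCHASE_UP_FRONT = 60_000          # perpetual licenses + implementation
MAINTENANCE_PER_YEAR = 10_000       # annual enhancement/support plan

def tco(years):
    """Cumulative cost of each model after a given number of years."""
    saas = SUBSCRIPTION_PER_MONTH * 12 * years
    on_premise = PURCHASE_UP_FRONT + MAINTENANCE_PER_YEAR * years
    return saas, on_premise

for years in (1, 3, 5, 7):
    saas, on_prem = tco(years)
    print(f"{years} yr: SaaS ${saas:,} vs. on-premise ${on_prem:,}")
# With these numbers the lines cross between year 3 and year 5:
# 3 yr: SaaS $72,000 vs. on-premise $90,000
# 5 yr: SaaS $120,000 vs. on-premise $110,000
```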

While some or all of these considerations can be mitigated by cloud solutions, the real question is the cost and complexity of doing so.  In all cases, special due diligence is highly recommended.

As the old saying goes: You know who the pioneers are, right?  They’re the ones up front with the arrows in their backs.  Proceed accordingly.


Let’s start with a basic premise: as Christopher Mims of the Wall Street Journal put it in a May 19th article, “Getting data into and out of the cloud is harder than most engineers often are willing to admit.”

The problem is bandwidth.

Granted, if you want to save the cost and trouble of storing data, the cloud works well when all you need to do is transfer data over high-speed wiring.  Although frankly, even that can be problematical for the many businesses who still have less-than-perfect Internet bandwidth.

But in a world where mass connectivity extends to a wide array of mobile devices, users and providers alike struggle with the limitations of wireless networks.  According to the World Economic Forum, Mims points out, the U.S. ranks only 35th in bandwidth per user.

This has helped make mobile apps a predominant way of doing things on the Internet, especially on smartphones.  Some of the data and processing power is actually handled on your device – not in the “pipes” (or in this case, airwaves) that lead to it.

And the issue of “getting things done” is becoming more problematical as the Internet becomes the Internet of Things, where “smart” devices can sense their environment and even receive commands remotely.  We’ll see this before long (already can, actually) in everything from drones to your refrigerator.

The problem is, modern 3G and 4G networks are not up to the task – they’re simply not fast enough – to transmit data from devices to the cloud at the pace it’s generated.  As an example, Mims cites Boeing, where “nearly every part of the plane is connected to the Internet” often sending continuous streams of status data to the cloud.  One jet engine alone is said to generate half a terabyte of data per flight.
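
A quick back-of-the-envelope calculation shows the mismatch.  The sustained uplink speed below is our own optimistic assumption, not a figure from Mims’ article:

```python
# Back-of-the-envelope: how long would one flight's half-terabyte of
# engine data take to upload over 4G?  The 50 Mbps sustained uplink is
# an optimistic assumption on our part.
data_bits = 0.5 * 10**12 * 8          # 0.5 TB expressed in bits
uplink_bps = 50 * 10**6               # 50 megabits per second
hours = data_bits / uplink_bps / 3600
print(f"{hours:.1f} hours")           # ~22.2 hours -- for a single flight
```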

Luckily, there’s a solution: Stop focusing on the cloud and start storing and processing these torrents of data on the devices themselves, or on devices that sit between our things and the Internet.  Someone at Cisco Systems has coined a name for this: “fog computing.”

Whereas the cloud is up in the sky, the fog is close to the ground, that is, where things are getting done.  It consists of weaker and dispersed little computers that are making their way into cars, appliances, factories and most any other new-tech object you can conjure.  In fact, Cisco plans to turn its routers into “hubs for gathering data and making decisions about what to do with it.”  These smart routers will never talk to the cloud unless they have to, say in a rail emergency or other critical application.
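
In code, that keep-it-local pattern might look something like this sketch – the sensor reading, threshold and function names are all invented for illustration:

```python
# A sketch of the fog pattern: handle routine readings locally, and talk
# to the cloud only for rare critical events.  All names and the threshold
# are invented for illustration.
CRITICAL_TEMP_C = 90.0

def log_locally(temp_c):
    print(f"local log: {temp_c:.1f} C")       # stays in the fog

def escalate_to_cloud(temp_c):
    # placeholder for a real uplink call (HTTP, MQTT, etc.)
    print(f"ALERT sent to cloud: {temp_c:.1f} C")

def handle_reading(temp_c):
    if temp_c < CRITICAL_TEMP_C:
        log_locally(temp_c)                   # no uplink needed
    else:
        escalate_to_cloud(temp_c)             # rare emergencies go upstream

handle_reading(72.4)   # -> local log: 72.4 C
handle_reading(93.1)   # -> ALERT sent to cloud: 93.1 C
```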

While the cloud consists of physical servers in nearly uncountable numbers today, the fog consists of all the computers that are already all around us, tied together.  So our mobile devices might send updates to each other, instead of routing them through the cloud.  In fact, the fog could eventually compete with the cloud for some functions.

If all this is putting you into a fog, fear not.  While much of computing’s future lies in the cloud, Mims thinks the truly transformative computing will take place right here, in the objects that surround us – in the fog.

