In our prior post, we pointed out that China is on the verge of becoming the world leader in the production, sale and implementation of robots, with a stated goal of producing at least half the nation’s own robots for manufacturing by 2025.  The takeaway from that view, outlined recently in Bloomberg Businessweek, might be that the world has much to fear from the ascendance of this wave of Chinese bots.

But a recent counterpoint to such a robot apocalypse, offered by Greg Ip of the Wall Street Journal, suggests that, in fact, robots aren’t destroying enough jobs fast enough.

In short, Ip points out that by enabling society to produce more with the same number of workers, automation like robotics becomes a major driver of rising standards of living – in effect, a productivity boost.  Some say that “this time is different,” fearing the technological change is so profound that millions of workers will be out of work or, at best, consigned to more menial tasks.  Ip says the evidence shows we’re moving in exactly the opposite direction.

He notes that while the U.S. “has many problems, job creation isn’t one of them.”  Job creation has averaged 185,000 per month this year, and unemployment is down to a ten-year low.  Wages are even up, slightly.  Ip says that “if automation were rapidly displacing workers the productivity of the remaining workers ought to be growing rapidly.”  Instead, worker output per hour has been dismal in most sectors, including manufacturing.

When researchers compare slow-growing occupations to fast-growing ones in data going back to 1850 (a proxy for job creation and destruction driven by technology), they find that churn relative to total employment is now the lowest on record.

Ip’s point is that the past was, in fact, much more ‘convulsive’ than today’s job churn.  American consumption, he notes, is gravitating toward goods and services whose production is not easily automated.  Societies are increasingly devoting “a growing share of their income to consumption in sectors where productivity [is] stagnant.”  The idea is that robots can replace fewer of the things that go into GDP than we think.

As examples, he cites medical spending flowing to new, more expensive treatments rather than cheaper existing ones, and child-care employment soaring because parents won’t leave their kids in the care of a robot.  Over the past decade, “low productivity sectors” including education, health care, social assistance, leisure and hospitality have added nearly 7 million jobs, whereas information and finance, where value added per worker is 5 to 10 times higher, have cut jobs or barely added any.

His conclusion: we need a change in priorities.  Instead of worrying about robots destroying jobs, we should be using them more, especially in low-productivity sectors.  While robots may one day replace truck drivers, “it’s more urgent to make existing drivers, now in short supply, more efficient,” and we should be more concerned with reducing the labor, and thus the cost, of energy than with the jobs added in areas like solar power.  The alternative, notes Ip, “is a tightening labor market that forces companies to pay ever higher wages that must be passed on as inflation.”  And that, he adds, “is a more imminent threat than an army of androids.”


Industrial automation continues to progress, and nowhere is it happening at a faster clip than in China.

Robots are making rapid headway in many plants around the world.  South Korea leads in adoption, with some 531 robots per 10,000 workers as of 2015, followed by Germany at 301 and the U.S. at 176.  At 49 per 10,000, China lags, but it is determined to catch up and surpass other nations.  In 2013, for the first time, more robots were sold in China than anywhere else in the world.  Last year, 90,000 robots were installed there, fully one-third of the world’s total – all in an accelerating effort to counter rising labor costs.

The same fervor that made China a leader in solar panels and high-speed rail is now being applied by its planners to the adoption of robots in factories.  And everyone should be concerned.

China has an aging workforce and rising labor costs, so industrial automation there is crucial.  But as John Roemisch, a VP at robot manufacturer Fanuc America Corp., says, “There’s nothing keeping them from coming after our market.”  As the CEO of iRobot adds, “China has a great history of being an effective fast follower… The question will be, can they innovate?”  To that end, China has three enormous advantages: scale, growth momentum and money, according to a May 7th article in Bloomberg Businessweek.

Through a sweeping proposal dubbed “Made in China 2025,” Beijing will focus on automating key sectors of the economy, including car manufacturing, electronics, appliances, logistics and food production.  At the same time, it plans to increase domestic production of robots to over 50% of total Chinese sales by 2020.

A Chinese start-up named E-Deodar has developed proprietary technology that allows it to build $15,000 factory robots, about one-third cheaper than foreign models.  It has mastered the technology for servomotors, drivers and control panels to gain a proprietary competitive advantage, and is said to act much like a Silicon Valley startup.  Says its general manager, Max Chu: “People ask me, how long can you make robots?  I say it’s simple, we will make robots until there’s no more people in factories.”

The U.S. is not sitting idle.  We’ve all seen the Roomba vacuums that promise to make domestic life a bit easier, and China and other nations are now collaborating on building them for the global market.  But building vacuum cleaners is one thing; the industrial goal here is much bigger.  Amazon, for instance, hopes to build logistics systems that create near-humanless warehouses, with packages delivered by drones or driverless vehicles.  JD.com is rushing to automate its business in a quest to replace tens of thousands of warehouse workers and deliverymen.  It is currently testing drones to deliver packages in rural regions and experimenting with robots that deliver on college campuses, according to Bloomberg.

As JD.com’s chief technology officer said recently, it’s all about “who can learn… who can get better faster.  We are all just starting out.”

But while this might be the most disconcerting part of all for today’s low-skill worker, an interesting, and completely opposite, counterpoint has been voiced by the Wall Street Journal’s Greg Ip.  We’ll present Ip’s counterpoint in our next and concluding post on the ascendance of robots.  Stay tuned…


As most of us have noticed by now, the pace of technology, which for centuries crept along as generation after generation lived more or less as their parents had, has been accelerating at what feels to many like a breakneck rate.  We’ve gone from linear progress to exponential, moving from the industrial revolution to the current digital one at an ever-quickening pace.

Moore’s Law, now over 50 years old, postulated that the number of transistors on a chip would double every year or two.  That pace has largely held, and today exotic technologies including AI (artificial intelligence), robots, cloud computing and 3-D printing are proliferating, in many cases evolving faster than we humans can keep up.  It seems like things keep getting faster, smaller and smarter.
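
The arithmetic behind that claim is worth seeing.  Here’s a quick back-of-the-envelope calculation of our own, assuming the slower two-year doubling cadence:

```python
# Rough arithmetic behind Moore's Law (our illustration, assuming one
# doubling every two years): 50 years compounds to ~33 million times.
doublings = 50 / 2
print(f"growth factor: {2 ** doublings:,.0f}x")   # 33,554,432x
```

Even at the conservative cadence, fifty years of doubling yields a machine tens of millions of times more capable, which is why the change feels exponential rather than linear.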

And therein lies the downside of all this technical innovation, says Gary Smith, a logistics expert with New York City Transit, in a recent issue of APICS Magazine.  Smith believes that “the rate of technological change exceeds the rate at which we can absorb, understand and accept it.”  This is acutely true in the world of supply chain, with its deep reach into manufacturing, distribution and business in general.

Most importantly, he notes that “disruptive technologies require a workforce that adapts to new processes, new ways of learning and training systems.”  In that spirit, he suggests the key skills and qualities that will matter in the supply chains of the future, ones the next-generation workforce can expect to incorporate into its work patterns.  Among them:

  • Data analysis and database development skills. The ability to analyze data and produce actionable results from it, using logic and fact along with insightful interpretation, will be critical.
  • Critical thinking. It’s vital to data analysis and fact-based decision making.  What matters is the ability to quickly acquire knowledge, break it down into its logical components, and then analyze and drill toward accurate, actionable conclusions.  You have to be able to take complex situations and break them down into their component tasks.  Or as Stephen Covey would say, “begin with the end in mind.”  Critical thinking means “abstraction, systems thinking, experimentation and collaboration,” notes Smith.  To wit:
  • Abstraction. The ability to discover patterns in data.  Often, lessons from one industry can be applied to another, for example.
  • Systems thinking. That is, viewing issues as a part of the whole – how issues relate to the rest of a system.  Often, the “good of the many outweighs the good of the few.”
  • Experimentation. Complex problems require trial and error, testing and experimentation.  It’s okay to fail, as that’s part of learning.  Fail fast, think differently and learn to adapt as new conditions present themselves.
  • Collaboration. Working with others toward a common goal.  Easier said than done.  It requires team-building and facilitation skills, along with everyone keeping their eyes on the prize.  Collaboration is particularly important in supply chain and ERP work, where silos need to be broken down and people need to cooperate and communicate effectively.

These are the critical skills companies will be looking for.  We see the need for them every day in our own company, and we’re only one of many.  So in a very real sense, the future really is now.

In our prior post (“The Evolving Promise of Unbreakable Computer Security”) we suggested that the evolution of quantum computing would make it possible to create virtually unbreakable computer security, thanks to its ability to produce almost absurdly difficult encryption based on prime-number “decompositions” that even today’s supercomputers cannot solve.

As a counterpoint, today we offer the opinion of the editors of The Economist, from the April 8, 2017 issue, in which they suggest that “computers will never be secure,” and that “to manage the risks, [we need to] look to economics rather than technology.”

We’ll let you be the judge as to which opinion might ultimately prevail.

The Economist editors suggest that “computer security” itself is a contradiction in terms.  Hardly a day passes that we don’t read about the latest cyber-attack (we’ve helped several of our ERP clients after they were harassed or held hostage through ransomware attacks).  Recently the central bank of Bangladesh lost $81 million… Yahoo almost torpedoed its sale to Verizon due to massive data breaches… and allegations persist about Russian hacking of the U.S. elections.

Today, there is a huge black market for data-theft and extortion tools, including hackers for hire.  And soon enough, the Internet of Things (of which we’ve written frequently) will add even more devices that were never expected to be hacked but are ripe for attack.  The bottom line, they write: “there is no way to make computers completely safe.  Software is hugely complex.”

And common sense would dictate that when you have millions of lines of code, as Google and Microsoft do, errors are inevitable.  The Economist states that “the average program has 14 separate vulnerabilities, each a potential point of illicit entry.”  And after all, we are reminded, there’s the internet, where security was pretty much an afterthought.

So, what to do?  According to The Economist’s editors, it’s all about managing the risk.  Their suggestions include:

  • Start with regulation. If you can’t weaken encryption for just the bad guys, then make sure encryption stays strong for everyone.  “The same protection that guards messages in WhatsApp also guards bank transactions and online identities.”
  • Set basic product regulations. They suggest promoting “public health” for computing, with measures ranging from requiring that “internet-connected gizmos” be updated when flaws are found, to forcing users to change passwords and user names often.  And enforce the reporting laws already in place that make companies disclose when they are hacked.
  • Overall, says The Economist, incentives to take security seriously are weak.  The long-established disclaiming of liability by software providers may soon collide with traditional protection and liability laws, especially as computer products become embedded in devices those laws already cover.  In other words, the courts may one day force the liability issue.  And there’s nothing like the government to come down hard with new rules.
  • Cyber-liability insurance. It’s a small but growing market for protecting consumers.  Product companies may soon find buying it preferable to assuming the destructive consequences of liability cases.

Finally, they note that when the internet was new, no one took security seriously, and no one objected.  But today’s internet is ubiquitous, and not taking security seriously, given the known risks and consequences, is no longer forgivable.  As the editors conclude, “changing attitudes and behavior will require economic tools, not just technical ones.”

The idea of “quantum computing” has been around for a while, but lately some very blue-chip companies have begun investing in it seriously, names like HP, Google and Microsoft among them.

Quantum computing is best thought of as the ‘next generation’ of computing technology, in which the weird and dazzling properties of the atomic and sub-atomic worlds govern what a computer is capable of.  Quantum theory was born about a century ago, but its practical use has long been out of reach.  That day is coming, though.  Everything in the natural world can be described by quantum mechanics – but it operates on a very different plane from the natural order of things we humans have come to know.  And sometimes, quantum properties can act downright… weird.

For example, in computers, the fundamental notion of a “bit” of information is defined by a flow of electric current that, like a switch, is either “on” or “off.”  There’s no confusion, and that foundation allows computers to work from flowing electrons, and software programmers to create code that depends on it.  But in the quantum world, things aren’t so simple.

Without veering off into strange properties and the famous Heisenberg uncertainty principle, which says that the mere observation of an atomic particle or event can change its very nature (you can determine a particle’s direction of movement or its location, but not both at the same instant), the bottom line is that a quantum bit can be both on and off simultaneously.  As scientists learn to harness the power of this notion of a ‘qubit,’ it promises to unleash phenomenally more powerful hardware and software than ever seen before.
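
To make that less abstract, here’s a toy numeric sketch, our own illustration rather than anything from the article: a qubit’s state is a pair of amplitudes over “off” (0) and “on” (1), and a measurement yields each outcome with probability equal to the amplitude squared.

```python
# Toy qubit: an equal superposition of 0 and 1 (illustrative only).
import numpy as np

qubit = np.array([1, 1]) / np.sqrt(2)   # amplitudes for "off" (0) and "on" (1)
probabilities = np.abs(qubit) ** 2      # Born rule: [0.5, 0.5]

rng = np.random.default_rng(7)
samples = rng.choice([0, 1], size=10, p=probabilities)
print(samples)   # each measurement collapses the qubit to a definite 0 or 1
```

The state really is “both at once” until it’s measured; the act of measuring forces a plain 0 or 1, which is the catch that quantum algorithms are designed to work around.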

Which brings us to the future of computing.

One of the most promising possibilities in quantum computing is that of unbreakable security.  Multiplying two ridiculously large prime numbers together is easy, but reversing that multiplication (the “decomposition,” or factoring) is mathematically extremely hard, and that one-way asymmetry is the basis of most modern cryptography in use today.  The unique on-while-off properties of qubits give a quantum computer the potential to work out those decompositions.
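
The asymmetry is easy to see in miniature.  Here’s a small sketch of our own; the primes are tiny by cryptographic standards (real keys use primes hundreds of digits long), but the one-way character of the operation already shows:

```python
# Multiplying two primes is one operation; recovering them from the
# product takes an exhaustive search. (Toy numbers, not real crypto.)

def factor(n):
    """Brute-force trial division: find p, q with p * q == n."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    raise ValueError("no odd factors found")

p, q = 999_983, 1_000_003   # two known primes
n = p * q                   # the easy direction: a single multiplication
print(factor(n))            # the hard direction: ~500,000 trial divisions
```

Every extra digit in the primes multiplies the search effort, and at real key sizes classical trial-and-error becomes hopeless.  That is precisely the gap a quantum computer running Shor’s algorithm is expected to close.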

The new algorithms made possible by quantum computing promise cryptographic solutions that quantum computers can crunch through, but that are well beyond anything even today’s supercomputers are capable of.

Meanwhile, companies like those mentioned earlier all have research programs exploring how best to harness these quantum capabilities in software and applications.  Early interest has come from governments and defense contractors, not to mention the NSA, as well as a growing number of startups.  These efforts build on the work of Dr. Peter Shor, who, at Bell Labs in 1994, first showed how a quantum computer would be capable of solving the prime riddle.

In the future, that capability would be useful “for all manner of currently intractable problems,” notes a recent article in The Economist (March 11, 2017).  Applications requiring extremely precise timing, perfectly accurate GPS triangulation and massively complex encryption will likely be among the early efforts.

While these machines and their software rank among mankind’s greatest engineering challenges, one tends to believe that in the long history of computing they’re simply the next step on the trail in the seemingly never-ending evolution of the computer.

(Note: In our next post, we’ll present a counterpoint to our “unbreakable security” thinking above, courtesy of the editors at The Economist. Stay tuned…)


A bit off topic today, but we think those who work for a living, and hope one day not to have to, may find this of interest in the pursuit of sound retirement investment strategies.

We are gradually learning, these past few years, that when it comes to making money by investing in the stock market, the lowly “exchange-traded fund” (or ETF) that tracks a broad basket of stocks across many companies or sectors (S&P 500 tracking funds being among the more obvious examples) generally beats the returns of funds managed by human money managers.  In other words, the overall market, over time, will exceed what most of even the best money managers can do for you.

This, of course, is disconcerting to those who earn their living picking stocks or selling their funds.  Nonetheless, it’s proving true.  According to several sources we’ve reviewed, something like 86% of managed funds do NOT beat the market, or even the so-called “benchmarks” they are measured against.

The core of the problem is that it is very hard to beat the market.  Obviously.  But in this era of rocket-scientist, algorithm-producing, quant-based, big-data stock picking… that doesn’t mean folks aren’t trying.  And that may be the problem.

With today’s computing power and an ever-growing wealth of raw data, it is possible to test thousands, even millions of data sets, ideas and trading philosophies.  The standard method is something called “backtesting,” in which someone comes up with a market hypothesis and then looks back over, say, twenty years to see how the strategy would have performed against real markets, with their unpredictable ups and downs, over that time.  To check the validity of the results, the technique is then run against “out-of-sample” data: market history that was not used to create the original technique.
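
Here’s a minimal sketch of that workflow in Python, on synthetic data.  The moving-average rule, its parameters and the 50/50 in-sample/out-of-sample split are illustrative assumptions of ours, not a real strategy:

```python
# Backtest sketch: evaluate a fixed rule on one half of the data,
# then check it on data the rule never saw.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0003, 0.01, 5000)   # ~20 years of synthetic daily returns
prices = 100 * np.cumprod(1 + returns)

def backtest(prices, returns, window=50):
    """Hold the market only when price sits above its trailing moving average."""
    ma = np.convolve(prices, np.ones(window) / window, mode="valid")
    signal = prices[window - 1:-1] > ma[:-1]   # yesterday's signal...
    return returns[window:][signal].sum()      # ...applied to today's return

split = len(prices) // 2
in_sample = backtest(prices[:split], returns[:split])
out_of_sample = backtest(prices[split:], returns[split:])
print(f"in-sample: {in_sample:.2%}   out-of-sample: {out_of_sample:.2%}")
```

Because these synthetic returns are pure noise, any “edge” the rule shows in-sample should evaporate out-of-sample; catching exactly that kind of mirage is what the out-of-sample check is for.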

But in the wrong hands, things can go, well… wrong.  Among researchers, analysts and economists, there is a powerful temptation to get published in finance journals.  Too often, this leads to “torturing the data,” as a recent Bloomberg Businessweek article pointed out (April 2017).  That in turn has led to some exchange-traded funds being built on flawed statistical techniques, according to a pair of experts at Duke University, who imply that “half the financial products promising outperformance that companies are selling to clients are false.”

For example, a batch of research involving United Nations data once found that the best (backtested) predictor of stock performance in the S&P 500 was butter production in Bangladesh.  That is to say, out of millions of data sets tested backward in time, the one that came closest to tracking the S&P 500’s actual market performance was the butter output of a third-world nation halfway around the world.

Researchers can “twist the knobs” on their assumptions in search of a prized “anomaly” they can write about, or sell.  They can vary, say, the “time period covered, or the set of securities under consideration, or even the statistical method,” according to Bloomberg’s Peter Coy.  Negative findings get round-filed; positive findings get published, or made into an ETF whose performance we may be relying upon for our retirement.  With enough tests, notes Coy, “eventually by chance even your safety check will show the effect you want.”
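
That multiple-testing trap is easy to demonstrate.  In this sketch of ours, every series is pure random noise, yet testing enough of them against a random “market” reliably turns up one that looks like a discovery:

```python
# Data mining in miniature: nothing here predicts anything, but the
# best of 10,000 random "predictors" still looks impressive.
import numpy as np

rng = np.random.default_rng(42)
market = rng.normal(size=240)            # 20 years of monthly "returns"

best_corr = 0.0
for i in range(10_000):                  # 10,000 candidate data sets
    candidate = rng.normal(size=240)     # e.g. "butter production"
    corr = np.corrcoef(market, candidate)[0, 1]
    if abs(corr) > abs(best_corr):
        best_corr = corr

print(f"best of 10,000 random predictors: r = {best_corr:.2f}")
```

The winner typically shows a correlation of around 0.25, “significant”-looking by most conventions, despite there being no relationship at all: butter-in-Bangladesh in miniature.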

So the next time you read about some great new investment strategy vetted by a Wall Street hedge fund’s top “quants” (the math wizards who come up with this stuff), take a deep breath, turn the page, and leave your long-term funds in a plain old vanilla stock market index fund from Vanguard, Fidelity or American Funds.

It will help to ensure that when it’s time to retire, your money will be ready too.

We see clients wrestle every day with a number of common complaints, obstacles and productivity-wasters when they talk to us about upgrading their business management systems.  These issues are common across many companies, so don’t feel too bad if they happen to describe your office too.  We’re talking about things like…

  • Spreadsheets.  They’re everywhere.  They’re disconnected.  They’re not available to everyone.  They can be difficult and costly to maintain and keep current.  And worst of all, they represent double- or even triple-entry effort across disparate platforms.  A lot of waste, redundancy and mistakes.
  • Information that’s all in one person’s head. Years ago, we had a client with a production scheduler who had all the magic formulas for How-To-Produce-What-On-Which-Machine (in what order) lodged inside his head, and his alone.  Job security, right?  I thought so too.  Until the company President told me that this fellow had already had two heart attacks!  They could joke about it among themselves (I mostly just kept my head down, and eventually developed a scheduler for them based on some of his knowledge). We have had many clients over the years where the institutional knowledge of certain critical functions was stored in the head of one person – often an owner.  Not exactly conducive to a happy exit strategy, is it?
  • Lack of inter-departmental communication. The classic “left hand doesn’t know what the right hand is doing” syndrome.  Usually it’s front office vs. back office, or production vs. shipping.  Sometimes, it’s like you’re working in two different companies – mostly because the information that needs to be shared simply isn’t in the right place at the right time.  The result is lots of trips out back (or front)… lots of intercom calls… lots of emails… and a whole lot of inefficiency, wasted steps, misspent energy, and “expedited” orders that become the norm.
  • Gut instinct and guesswork as stand-ins for accurate reporting and real business intelligence. When you don’t have the data, you guess — sometimes correctly, sometimes not so much.  Or you ask a person who really doesn’t know the answer.  Or you do it the way you’ve always done it because, hey… that’s the way we’ve always done it.

If some or all of these sound familiar – and they’re usually only the tip of the iceberg – you’re not alone.  That doesn’t mean you shouldn’t do something about it.  The sad truth is, companies lose tens of thousands, even hundreds of thousands of dollars each year to these sorts of inefficiencies – and they don’t even realize it!

What could it do for your bottom line, your company’s value and your ability to serve the customer… if only you did realize it, and did something about it?

In the end, you may not be alone.  Misery loves company, right?  But is that really the competitive position you want to be in?  For now, just be glad you don’t know what it’s really costing you. And when you’re finally ready to do something about it, you’ll be taking the first step on the road to a better company.