Emily Chang is a journalist, the anchor of Bloomberg Technology, and the author of “Brotopia: Breaking Up the Boys’ Club of Silicon Valley.”  Recently, she penned an article in Bloomberg Businessweek about an old (in tech terms) digital artifact by the name of Lena Soderberg.  Lena first became famous in November 1972 when, as Lenna Sjooblom, she was featured as a centerfold in Playboy magazine.  That spread might have been the end of it but for the fact that researchers at the University of Southern California computer lab were busy digitizing physical photographs – early work on the image compression that would eventually give us the JPEG (or .jpg) format we all know from Internet images today.

According to the lab’s co-founder, William Pratt, now 80, the group chose Lena’s portrait from a copy of Playboy brought to the lab by a student.  The team needed suitable photos on which to test their algorithms for compressing large image files so they could be transferred digitally between devices.  Apparently their search led them to Lena, the 21-year-old Swedish centerfold.  Go figure.

Lena ended up becoming famous in early engineering circles, and some refer to her as “the first lady of the Internet.”  Others see her as Silicon Valley’s original sin – the larger point of Ms. Chang’s article – but that’s a topic for another post.

Apparently, Lena’s photo was attractive from a technical perspective because it included, according to Pratt, “lots of high frequency detail that is difficult to code.”  That would apparently include her boots, boa and feathered hat.

According to Ms. Chang, for the next 45 years, Lena’s photo (seen at the top of this post), featuring her face and bare shoulder, served as “the benchmark for image processing quality for the teams working on Apple Inc.’s iPhone camera, Google Images, and pretty much every other tech product having anything to do with photos.”

To this day engineers joke that if you want your image compression algorithm to make the grade, it had better perform well on Lena.
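For the technically curious: benchmarks of this kind are typically scored with metrics such as peak signal-to-noise ratio (PSNR), which compares a compressed image against the original. Here is a minimal sketch in plain Python, using made-up pixel data (not the Lena image) and coarse quantization as a stand-in for real lossy compression:

```python
import math
import random

def psnr(original, compressed, max_val=255.0):
    """Peak signal-to-noise ratio in dB: higher means less distortion."""
    n = len(original)
    mse = sum((a - b) ** 2 for a, b in zip(original, compressed)) / n
    if mse == 0:
        return math.inf  # identical images: no distortion at all
    return 10 * math.log10(max_val ** 2 / mse)

# Simulate lossy compression by coarsely quantizing 8-bit pixel values.
random.seed(0)
pixels = [random.randrange(256) for _ in range(64 * 64)]
step = 16
quantized = [(p // step) * step + step // 2 for p in pixels]

print(round(psnr(pixels, quantized), 1))  # roughly 35 dB for this step size
```

The high-frequency detail Pratt mentions is exactly what suffers most under this kind of quantization, which is why a busy image like Lena’s made a demanding test.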

To a lot of male engineers, Lena thus became an amusing historical footnote.  But to their female peers, the photo was “just alienating.”  And that has a lot to do with the ingrained gender biases that permeate the tech industry to this day, where the majority of employees are still male.

There’s much more in Emily Chang’s essay, which we’ll try to sum up in a subsequent post.  Stay tuned…

Mike Lazaridis is justly famous and wealthy for being the guy who co-invented the BlackBerry, the first ‘must-have’ personal digital assistant.  And Lazaridis says he “won’t be iPhoned again.”

With colleague Doug Fregin, Lazaridis has poured nearly half a billion dollars into projects involving quantum computing over the past 20 years, and now runs a venture company that supports the effort.  Citing past failures and the scope and breadth of computing’s next frontier, quantum computing, he notes that “you have to build an industry.”  The importance of being nimble, close to customers and constantly moving forward “can’t be done with just one company” says Lazaridis.

Companies including Google, IBM and others are also chomping at the quantum bit, and so Lazaridis has chosen to make his well-placed, narrower venture bets on companies and technologies that could be commercialized in just a few years.

That’s important because quantum (as we’ve written about in this blog several times before) is tricky.  While classical computers handle their information bits as 1s and 0s, a quantum bit, or qubit, can be both a one and a zero at the same time, enabling a level of multi-tasking previously unthinkable.  The potential, in fields as diverse as weather, aviation and warfare, is enormous.  But quantum as it exists today is still on shaky ground.  The existing number of quantum computers is small, and as Bloomberg Businessweek reports in a recent article, “they become error-prone after mere fractions of a second – and researchers say perfecting them could take decades.”
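The one-and-zero-at-the-same-time idea can be made concrete in a few lines of code. This is our own plain-Python illustration (not any vendor’s API): a single qubit represented as two amplitudes, put into an equal superposition by a Hadamard gate, the standard textbook operation for doing so:

```python
import math

# A qubit state is a pair of amplitudes for |0> and |1>;
# the squared amplitudes must sum to 1.
ket0 = (1.0, 0.0)  # the definite "zero" state

def hadamard(state):
    """Apply a Hadamard gate, mixing the |0> and |1> amplitudes equally."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = hadamard(ket0)

# Measurement probabilities are the squared amplitudes.
probs = (abs(state[0]) ** 2, abs(state[1]) ** 2)
print(probs)  # both outcomes come out at roughly 0.5: one AND zero at once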

Hence the importance of aiming carefully.  Several of Lazaridis’ investments have come to market, or are close.  Isara Corp. sells security software it says can block quantum hacks and projects sales of $3M in 2018. High Q Technologies claims that by year-end it will be selling quantum sensors 100,000 times more sensitive than the tools pharmaceutical companies use today to develop drugs. (Our featured photo today is of a device used to test the superconducting films used on silicon at the atomic level.)

Lazaridis has teamed with former BlackBerry colleagues to connect quantum computers with conventional computers, in order to make quantum more accessible to a wider audience.  Those efforts will still need to prove themselves viable as businesses, but the mere idea reinforces the growing industry conviction that the current state of computing will not remain the status quo, and that the future of computing is quantum.

It’s a race, he knows.  One driven by venture capital and the ability to put one’s money where one’s mouth is.


Our friends at Insight Works, a provider of warehouse automation and software solutions, have pointed out the most common ways companies lose money through warehouse and inventory mistakes.  Considering that these mistakes are widespread, and that the U.S. Bureau of Labor Statistics says the number of U.S. warehouses has risen 15% since 2010, they are worth sharing.

Overstocking.  It’s an expensive issue for all supply chain operations.  Statistics show that U.S. companies are sitting on $1.43 of inventory for every $1 in sales – and that’s too much.  Errors ranging from incorrect bin labeling and putaway procedures to lack of oversight or pure laziness all contribute.  The net result is a considerable amount of tied-up working capital which, if recaptured, represents a sizeable bump to the bottom line.
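To put the $1.43 figure in perspective, a quick back-of-the-envelope calculation helps. The sales number and the leaner target ratio below are hypothetical, chosen only to show the arithmetic:

```python
# Rough arithmetic on the $1.43-per-$1 figure: working capital tied up
# in inventory relative to a leaner (hypothetical) target ratio.
annual_sales = 10_000_000   # hypothetical company's annual sales
inventory_ratio = 1.43      # dollars of inventory per dollar of sales
target_ratio = 1.00         # hypothetical leaner target

tied_up = annual_sales * (inventory_ratio - target_ratio)
print(f"${tied_up:,.0f} tied up in excess inventory")  # $4,300,000
```

At typical carrying costs, freeing even part of that sum flows straight to the bottom line, which is the recapture point made above.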

That’s where automation helps, and often, very quickly once implemented.  A warehouse management system can automate stocking processes and ensure accuracy in stock counts, while also supporting managers with insights into optimal time to restock.

Mispicks and mis-shipments.  A single mispick costs, on average, about $22, and the average U.S. company has been shown to lose $390,000 annually to mispicks (yes, that average obviously includes some pretty large firms, but not exclusively).

Here again, automation pays.  A robust warehouse management system can pinpoint and address inefficiencies. Barcode scanners help reduce the chances of mispicks and mis-shipments by eliminating risks associated with manual data entry.  Any picking errors are identified instantly by the barcode scanner, and incorrect items never make it to shipping, let alone customers.
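At its core, the scan-validation step is a simple equality check before an item leaves picking. Here is a hypothetical sketch (the field names, SKU and barcodes are our own illustration, not any particular WMS’s schema):

```python
# Hypothetical sketch: a pick passes only when the scanned barcode
# matches the order line, so mispicks are caught before shipping.
def validate_pick(order_line, scanned_barcode):
    """Return True only when the scanned item matches the order line."""
    return order_line["barcode"] == scanned_barcode

order_line = {"sku": "WIDGET-42", "barcode": "012345678905", "qty": 3}

print(validate_pick(order_line, "012345678905"))  # True: correct item
print(validate_pick(order_line, "999999999999"))  # False: mispick caught at the scan
```

The point is that the check happens at scan time, so an incorrect item is flagged at the shelf rather than discovered by the customer.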

Inventory count errors.  Mistakes in cycle counts, prevalent in manual or paper-based systems, hurt efficiency and lead to carrying either too much or, occasionally, too little inventory.

As the folks at Insight Works note… While inventory counts are undertaken to help support accurate inventory records, manual counts are time-consuming and prone to error.  A worker may, for example, accidentally group differently sized items together and count them all as the same size, resulting in an incorrect count.  Replacing these manual processes with an automated system that leverages barcode scanners can reduce time spent on cycle and inventory counts and cut down on errors.  Best of all, when items are scanned and counted, the data is automatically added to the inventory management system, further reducing administrative tasks and supporting accuracy without extra work.

Picking time.  Today’s WMS systems can direct workers on the most efficient routes to make their picks when filling orders.  Some can even provide optimal picking order based on certain order characteristics.  Workers save time, increase pick rates, and substantially reduce picking errors compared to manual systems.  This helps make even new, less experienced workers more efficient in short order.
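One simple way a system can sequence picks is a greedy nearest-neighbor heuristic: always walk to the closest remaining location. This is a minimal sketch under our own assumptions (grid coordinates, Manhattan walking distance); real WMS routing is considerably more sophisticated:

```python
# Hypothetical sketch of one simple pick-routing heuristic: greedy
# nearest-neighbor ordering of pick locations on an (aisle, shelf) grid.
def order_picks(start, locations):
    """Visit each location by always moving to the closest remaining one."""
    route, current, remaining = [], start, list(locations)
    while remaining:
        nxt = min(
            remaining,
            key=lambda loc: abs(loc[0] - current[0]) + abs(loc[1] - current[1]),
        )
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route

picks = [(5, 2), (1, 1), (3, 0)]
print(order_picks((0, 0), picks))  # → [(1, 1), (3, 0), (5, 2)]
```

Even a heuristic this simple beats wandering the warehouse in order-entry order, which is the gain new workers see immediately.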

Incoming inventory errors.  Some warehouses still use time-consuming and error-prone manual processes for counting and reconciling incoming shipments, which delay outgoing orders awaiting updated inventory data.  Instead, workers can leverage barcode scanning to receive and verify incoming shipments more quickly, because scanned data is automatically sent to the warehouse management system — no extra steps needed. This supports speed and accuracy and helps avoid human errors related to incoming inventory.


The bottom line is this: Automated warehouse management systems have been around for years, and they greatly improve the efficiencies of all warehouse operations.  In fact, probably no other ERP automation component provides a faster payback.  If you’re not already fully automated ‘out back,’ perhaps it’s time you asked yourself why.


A few tips from Panorama Consulting are worth considering before a company attempts to convert to a new business system, and we thought we’d reprise them for readers today.  They’re culled from the original post here.


To position yourself for success they suggest…

  1. Validate the scope and timing of your project. This is the basic stuff: Make sure you’ve scoped out the right modules, the right number of users and the right types of users (for example, ‘full users’ can be costly, while shop-floor users usually cost considerably less).  Ask questions about what add-ins (report writers, third-party apps, etc.) might have been used during the demo you saw.  Be sure you purchase the right licensing type, and know what maintenance costs will be each year for the whole implementation going forward.  Remember too, you don’t have to buy all the software upfront.  (We often tell users to buy just what they need initially, which saves short-term purchase dollars and avoids paying maintenance on licenses they don’t yet need.)
  2. Source your internal and external implementation project resources. While a vendor may be anxious to get started (and make a sale), be sure you have your ducks in a row; that is, be sure you have the right team in place, including business analysts, knowledgeable managers, IT, shop floor, etc.  Roles and responsibilities need to be assigned early on, and project leaders at both client and vendor teams need to be clearly delineated.
  3. Build a complete project strategy and plan. Beyond what your consultants or vendor can do for you, project implementation requires careful planning and attention to a lot of details.  Be clear about every step of your processes and workflows, or it will trip you up during implementation when you can least afford it.  With your provider, draw a map of all key workflows so that together you can determine how they’ll fit, both software-wise and process-wise, into your new system.
  4. Begin key implementation critical path activities. Delays related to people and processes will derail an ERP project far more often than technical IT or software issues.  Where will the data come from (transfer or key-in)?… Who will be trained, and when, where?… Who do you turn to with workflow questions during implementation?… Who approves changes?  Consider these sorts of critical questions before you start down the road of full-on implementation.
  5. Define your project charter. Have you set out clear project structure and ‘governance’?  A project charter that includes plans, roles, responsible managers and stakeholders ensures that when the inevitable questions arise, there’s a clear chart of responsibility for answering them.  And be honest: companies only go through these kinds of projects every ten years or so – so don’t expect to be experts.  By all means, work with your provider to help answer some of these questions and to provide templates or examples from their prior implementation experience.  Your provider should be able to offer a lot more than just selection expertise.  They should also understand business, project strategy and the dos and don’ts of successful implementations.  So don’t be afraid to ask.


Our firm provides specialty software for the printing industry from a company called PrintVis.  Their print-industry-specific solution consists of a wide range of added functionality baked into Microsoft Dynamics NAV (as they say, “You can buy NAV without PrintVis, but you can’t buy PrintVis without NAV”).  That functionality has enabled them to automate close to 400 printing operations, from the largest to the smallest, in dozens of countries around the world.  So we thought we’d take advantage of their expertise by sharing their recent post (#134 in a series…) on the NAV Change Log, courtesy of a consultant by the name of Doug Wiley, who wrote the post for them.  Excerpts follow…

In essence, the Change Log is exactly what it sounds like: a log of changes made in your database. There are two types of “audit trails” in PrintVis: those that track transactional data (like inventory movements or changes to the G/L) and those that track changes to master records.

The first type is always on, and can’t be shut off. The second type is the Change Log, and that needs to be configured and turned on manually.

Turning it on is easy. Just search for “Change Log Setup” and open the window. A single check box turns on the Change Log.  The more detailed setup is in the background, reached by clicking “Tables” in that window’s ribbon. Here you will determine which tables, and which fields within those tables, you would like to track. You have the option to track “Insertion” (adding a record), “Modification” and “Deletion” by table.

Also notice that for each of these you have the option to leave tracking off (“blank,” even if the Change Log itself is turned on), track all fields, or track some fields by selecting just the ones you’d like.

The interface to this functionality is relatively simple, but the way it’s configured is where the nuance and decision-making come in.  In the past, it was always recommended that the Change Log be used sparingly, to keep the database from growing too large as people altered records. Now that disk space has become more abundant and cheaper, you can err on the side of using it, but there is still good reason to plan well: if you want to find who made a change, to what, and when, you will still need to sort through all of the changes that have been logged.

Fortunately, NAV has excellent filtering and sorting tools which will get you what you want – but still there’s no need to capture a bunch of changes which don’t really matter that much from an operational standpoint.  Picking which fields in which tables you’d like to track is step one of this process. For example, you probably want to know if a customer’s payment terms change, but not so much if their main contact phone number or email address does. You may want to track if someone changes your posting group setup (which drives all your accounting and financial reporting), but not if someone changes the name of a journal batch. Your consultant will help walk you through this process and give examples of best practices.

The next step is choosing when you want to track these changes. Obviously, turning this on while setting up a new database would be madness, since every new record imported would generate an unnecessary entry. Generally, it’s recommended to turn it on only once final setups and master data have been approved and put into the database.

The third and final step is maintaining which changes are tracked. For example, if for some reason there needs to be a mass update to your cost centers, you probably want to turn this off while that happens to avoid generating a bunch of data (and remember to turn it back on!).

Our thanks to Doug Wiley for pointing out these Change Log tips, which we hope will help our NAV users get even more out of this most powerful and robust ERP system.


Last year a U.S. intelligence sector contest — to see who could develop the best AI algorithm for identifying surveillance images among millions of photos — was won, not by an American firm, but by a small Chinese company.  The result served perhaps as a warning shot that the race to dominate the realm of artificial intelligence, or AI, is on – and American victory is by no means assured.  (This post is based on reporting by three reporters in the Jan. 23rd Wall Street Journal.)

The Chinese company, Yitu Tech, beat out 15 rivals thanks to one big advantage: its access to Chinese security databases covering millions of people, on which it could train and test its AI algorithms.  Privacy rights in the U.S. make that particular sort of application harder to develop, as companies lack access to the enormous troves of such data common in China, often assembled with the aid of government agencies.

AI algorithm development depends on vast troves of data on which to develop and test hypotheses and angles that are not always obvious.  Microsoft chief legal officer Brad Smith notes that “The question is whether privacy laws will constrict AI development or use in some parts of the world.”

The U.S. leaders in AI include Apple, Alphabet (Google), Amazon, Facebook and Microsoft, and they’re up against Chinese giants like Alibaba, Baidu and Tencent Holdings.  On the academic side, the U.S., particularly in the regions surrounding Silicon Valley, holds a strong lead, both in budgets and in patents filed in areas like neural networks and something called unsupervised learning.  The U.S. also outnumbers China in number of AI companies by about two to one.  As well, current spending on AI in the U.S. is massive, with R&D investments of over $13 billion each from Microsoft and Alphabet.  By comparison, Alibaba only recently pledged to triple its R&D spending to $5 billion over the next three years.

But China plans to close the gap with a new government-led AI effort to lead the field by 2030 in areas including medicine, agriculture and the military.  Thus, AI startups in China are seeing a tenfold rise in funding over last year.  PwC expects China to catch up and reap about 46% of the $15.7 trillion it expects AI to contribute to global output by 2030, with North America a distant second at about half that percentage.

In the West there is a general reluctance to give companies wide use of customers’ data.  Even though U.S. laws and restrictions are still weaker than those in Europe – where privacy rights have become even more contentious, and where tougher new laws are scheduled to take effect in May – some experts think this reluctance could hamper U.S. AI development efforts and allow Chinese companies to pull ahead in the race for global AI dominance.

Regulation “could be a help or a detriment,” according to former Google and Baidu exec Andrew Ng, who recently founded an AI startup.  He adds that “Despite the U.S.’s current lead in basic AI research, it can be easily squandered in just a few years if the U.S. makes bad decisions.”

The race is indeed on.

We noticed a great closing article in this quarter’s issue of APICS Magazine, by Randall Schaefer, a CPIM and retired consultant, on how he once described his daily routine to his boss, and how it became an inspiration to others.  We’ll reprise his story here, from the Jan-Mar 2018 issue.

Schaefer’s career covered 50 years in supply chain beginning in the 1960s.  He recalls a time of no computers and resistance to procedures and discipline.  Then came computers, at least for accounting.  By the 80s he found managers finally embracing technology.  Then at a new organization, Schaefer endured his first performance review and found himself “on the wrong side of the company’s expectations.”

Apparently, he notes, the general consensus was that he didn’t do anything.  Subordinates and superiors all agreed.  Now in truth, Schaefer points out that he “trained my subordinates well and… brought our department’s metrics to all-time highs.”  But people noticed that he was not stressed out or always resolving some minor disaster.  They decided he wasn’t busy enough.

When his boss asked Schaefer to describe his daily routine, he told him he had none.  His style was to ensure that subordinates were following the disciplines and processes he’d put in place.  His only routine was to continually assess whether those processes and procedures were still valid.

His boss thought a manager ought to personally handle more tasks, rather than delegating.  But the results were undeniable.  Thus, notes Schaefer, “he could only advise me to change my ways” and “I disregarded his advice.”

A year later, his metrics were even better.  His boss hated it.  Schaefer was changing the expectations.

Years later he applied for a job in the automotive industry.  In an interview with the president, Schaefer was asked to describe his management style.  He decided to play it straight, he says, even if it might kill his chances.  “I don’t do anything,” he told him.  Then he says he smiled and added, “At least, that’s how it appears to others.  I learn every process, procedure and discipline in my areas of responsibility.  I teach each one to my subordinates and expect them to be followed routinely.”  They knew to come to him if something wasn’t right so he could fix those things.  In short, he noted “I have been very successful and would like to continue this success at your company.”

The president smiled and nodded, saying “I agree with you.  The busiest-looking managers rarely get the best results.”  And then he offered Schaefer the job.

On his first day, the president asked Schaefer to write down the description of his management style as he had shared it in his interview.  The president wanted, he said, to memorize it because an old boss of his had also once accused him of not doing anything.

And that may be the best definition of what a manager “does” that we have ever heard.