
Archive for the ‘Bidness, and Other Current Economic Realities’ Category

Recently, The Wall Street Journal ran an article we thought worth sharing about “what research tells us about effectively taming your inbox, when to use all caps, whether to use emoticons, how quickly to respond to messages – and much more.”

(Well, we guess, that should cover it.)  Here’s what they found…

  • Replying to email promptly? Not always a good thing.  In companies whose cultures “emphasize speed of response, workers are more stressed, less productive, more reactive and less likely to think strategically.”
  • Handling email after hours? Also detrimental, the Journal report opines.  While some may feel more pressure to respond, those who do aren’t necessarily more efficient – they simply generate a higher volume of mail without actually getting more work done.
  • On the other hand… findings from a study of extroverts suggest that when they are working on routine tasks, “being interrupted by an email notification might be good for them – the social stimulation… may help avoid boredom and complete tasks more efficiently.”
  • When’s the best time to send an email? Studies show that when faced with a screen packed with information, people focus on what’s on top, so you want your email to be near the top of the inbox when people are checking it.  A study of 16 billion emails found that people “replied more quickly early in the week, and those replies were also longer.”  The same applied to time of day – between 8:00 AM and noon was best.  Apparently, then, your best bet is to fire off your most important missives on a Monday morning.
  • What about email as a negotiation tool? Most would agree that as a negotiating tool email pales in comparison to a face-to-face meeting, so take advantage of its particular strengths, which include “the ability to rehearse what to say and convey a lot of information in a clear specific form that people can refer back to later on.”  As one researcher said, “if you understand how to use email effectively, it can be very helpful for your negotiations.”
  • SOME all caps is fine. It’s a long-held tenet of email that using ALL CAPS is shouting!  But research says that’s not always right.  Used judiciously, a word or two in caps can provide emphasis, communicate urgency or inject humor.  Just don’t write your whole email that way.
  • What about emoticons? Turns out, those little faces and pictures have been shown to help with comprehension, shave a bit of negativity out of a message (or add a note of positivity), and are fine to use with people you know.  The caveat is to avoid them in the wrong circumstances – an introductory business email, for example, where they can set the wrong first impression.  People may view you as ‘less competent’ and thus be less likely to share information with you.
  • Take the time to view messages from the other person’s perspective. Research found that people are “consistently overconfident in their ability both to understand emotion in email and to convey it.”  Instead of skimming emails and firing off quick responses, they say you should take the extra time to view those exchanges from the other person’s perspective.

Good email communication, the Journal concludes, “is not about our intentions, but about the meaning that other people assign to what we write.”  In other words, the way people read your email might be different from how you thought you wrote it.  It happens… all the time.  They suggest asking yourself, “This is what I meant, but is this what the other person will get?”

 

Read Full Post »

We noted in our prior post that underlying the cryptocurrency called “bitcoin” is what, in the long run, may be the more important element at play here: the blockchain.  Our prior post quotes The Wall Street Journal’s Christopher Mims’ fine explanation of the concept.  Now we’ll look at some important applications of blockchain technology.

In logistics, Walmart already uses a blockchain that lists over a million items for sale, including chicken and almond milk, and gives its supply chain traceability all the way forward and backward, from source to sale.

Global shipper Maersk uses IBM’s blockchain technology to track shipping containers and move them through customs faster.

Both efforts are expanding rapidly, and other companies cited by Mims include Kroger, Nestlé, Tyson Foods and Unilever.

A company called Everledger was started in 2014 with the intent of creating a blockchain that traces every certified diamond in the world.  It already has over 2 million diamonds in its registry, and adds another million or so per year.  Everledger records 40 measures of each stone, lending it traceability “from when it’s pulled from the earth to the day it’s purchased by a consumer.”  Every participant in that chain from miner to retailer maintains a node with a copy of the database in the blockchain.
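To make the idea of a shared registry concrete, here is a minimal sketch, in Python, of how such a provenance log might behave.  It is purely illustrative: the stone ID, stages and fields below are hypothetical, not Everledger’s actual schema, and the real system records 40 measures per stone and replicates the log on a node at every participant from miner to retailer.

    from collections import defaultdict

    # A toy provenance registry; every participant keeps an identical copy of this log.
    # The stone ID, stages and fields are illustrative, not Everledger's actual schema.
    registry = defaultdict(list)

    def record_event(stone_id, stage, details):
        registry[stone_id].append({"stage": stage, **details})

    record_event("DIA-0001", "mined",     {"origin": "Botswana", "rough_carats": 2.40})
    record_event("DIA-0001", "cut",       {"cut": "round brilliant", "carats": 1.02})
    record_event("DIA-0001", "certified", {"lab": "GIA", "color": "F", "clarity": "VS1"})
    record_event("DIA-0001", "retailed",  {"country": "US"})

    # Tracing the stone forward or backward is just a matter of reading its event list.
    for event in registry["DIA-0001"]:
        print(event)

What a real blockchain adds to this simple picture is tamper resistance and replication, so that no single participant can quietly rewrite a stone’s history.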

A company in Israel puts internet-connected sensors on pallets and uses business intelligence analytics to determine when and where items could be damaged.  Blockchain participants can record every stage of the journey at the package, pallet and shipping-container level.

Even whole countries are adopting blockchain.  Dubai intends to be “the first blockchain powered government in the world by 2020.”  By moving its central record of all real estate transactions onto a blockchain, for example, it will make property titles faster and easier to transfer.

As blockchain technology becomes more widely accepted and integrated into supply chains, it has the potential, as Mims notes, to be a “fundamental enabling technology,” similar to how new data transmission standards across networks made the internet we know today possible.  It could one day underlie everything from “how we vote to whom we connect with online to what we buy.”

That being said, it’s wise to recognize that the current bitcoin craze is merely one application of blockchain technology.  Clearly, much more is, and will be, possible through blockchain.  Bitcoin may – or may not – be here to stay; but blockchain seems to have all the merits and rapid adoption of a technological foundation that could change the way businesses run.

 

Read Full Post »

With all the hype surrounding bitcoin these days making it sound more and more like a modern-day equivalent of the 17th century tulip bulb mania, it’s important to remember that there actually is something important going on here.  And it’s not about the bitcoin.  It’s about the underlying technology for bitcoin – the blockchain.

Investment manias may come and go, and bitcoin will likely make some folks rich (it already has for those who bought bitcoin at the start of 2017 at $963 and watched its price soar to nearly $20,000 by year-end; it’s since fallen back to around $8,300 as of this writing), and likely leave some ‘greater fools’ broke a little further down the line.  After all, bitcoin has no intrinsic value and isn’t economically based on anything; its price is merely whatever some other person is willing to pay for it.  As a currency proxy, it has a ways to go.

But the blockchain that bitcoin is built upon – that’s another thing.  And a recent article by Christopher Mims in The Wall Street Journal provides some of the best explanation we’ve seen for why it matters.

What is a blockchain?  As Mims explains:

“It’s essentially a secure database, or ledger, spread across multiple computers.  Everyone has the same record of all transactions, so tampering with one instance of it is pointless.”

He goes on to explain that the underlying cryptography…

“…allows agents to securely interact – transfer assets, for example – while guaranteeing that once a transaction has been made the blockchain remains an immutable record of it.”
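To make that description concrete, here is a minimal sketch, in Python, of a hash-chained ledger of the kind Mims describes (the transactions are hypothetical).  Each block stores the hash of the block before it, so altering any past entry breaks the chain and is immediately detectable by anyone who holds a copy.

    import hashlib
    import json

    def block_hash(contents):
        # Hash a block's contents, which include the previous block's hash.
        return hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()

    def add_block(chain, transaction):
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        contents = {"transaction": transaction, "prev_hash": prev_hash}
        chain.append({**contents, "hash": block_hash(contents)})

    def is_valid(chain):
        # Every block must point at the prior block's hash and match its own recomputed hash.
        for i, block in enumerate(chain):
            expected_prev = chain[i - 1]["hash"] if i > 0 else "0" * 64
            contents = {"transaction": block["transaction"], "prev_hash": block["prev_hash"]}
            if block["prev_hash"] != expected_prev or block["hash"] != block_hash(contents):
                return False
        return True

    ledger = []
    add_block(ledger, {"from": "Alice", "to": "Bob", "amount": 5})    # hypothetical entries
    add_block(ledger, {"from": "Bob", "to": "Carol", "amount": 2})
    print(is_valid(ledger))                    # True
    ledger[0]["transaction"]["amount"] = 500   # tamper with history
    print(is_valid(ledger))                    # False – the chain no longer checks out

In a real blockchain, the same ledger is replicated across many independent computers, which is why tampering with any one copy is, as Mims puts it, pointless.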

Blockchain has the power to transform industries for three reasons, notes Mims.

First, it’s well-suited to transactions that require trust and a permanent record.

Second, it suits processes that require the cooperation of many different third parties.

And third is… the hype.  “The excitement around cryptocurrency gives blockchain the visibility to attract developers and encourage adoption.”  In this way, blockchain resembles the cloud, which also gave many industries “new business processes, disruptive startups and new divisions within existing companies, an ecosystem of supporting technologies, and new ways to charge for services.”

We’ll take a look at some of that disruption in our concluding post, so stay tuned.

Read Full Post »

We started this series of three posts, concluding today, with the story of “Lena,” the so-called first lady of the internet, named for a former Playboy model whose 1972 image became a gold standard of sorts for the compression algorithms used in the efficient transfer of digital images.  In the follow-up post (here) we noted the paucity of women in programming at that time, and the study that helped entrench an intrinsic bias towards men that has dominated the programming employment picture for decades.  Those posts were based on the work of Emily Chang in an article for Bloomberg BusinessWeek and a new book entitled Brotopia: Breaking Up the Boys Club of Silicon Valley.  Today we’ll conclude with some of Ms. Chang’s thoughts on the recent past and the subject of women in tech.

Chang points out that in Google’s early days, founders Sergey Brin and Larry Page sought to hire women for key positions, and succeeded wildly when you consider that they brought on board Susan Wojcicki, who helped build Google’s AdWords and AdSense, two products that formed what Chang calls the “near-perfect business model” that today drives Google’s $100 billion business.  They then brought on Sheryl Sandberg, who had been chief of staff to Larry Summers at the U.S. Treasury, to help transform Google’s new self-serve ad operation into what is now “bigger than any ad agency in the world.”  Today Sandberg is COO at Facebook.  Finally, they brought in Marissa Mayer as a product manager for Google’s search page.  She would eventually become CEO at Yahoo.

But despite hiring some of the most powerful and successful women the tech industry has seen, by 2017 Google disclosed that only 31% of its employees were female, and only 25% of leadership roles and 20% of technical roles were filled by women.

The issue, according to Chang, has much to do with what happens when you start to scale hiring.  Industry-standard recruiting models rely on many of the same school job fairs and the same recruiting websites, and subscribe to the same ‘questionable’ theories about what makes for a good engineer.  Google eventually concluded that its hiring velocity kept it from being as ‘thoughtful’ about the hiring process, or casting as wide a net, as it could have.

Determined to make changes, in 2015 the newly rebranded Alphabet Inc. hired several women into key positions and ended up with a management team that is 40 percent female.  As yet, none of the heads of Google’s 13 key divisions is a woman – but still, it’s progress.

The lesson from the past, though, is clear: women like Wojcicki, Mayer and Sandberg brought wider skill sets to the company in its earliest days, and the company succeeded wildly.  Notes Chang: “If subsequent managers at Google understood this lesson, that might have quieted the grumbling among engineers who had a narrow idea of what characteristics made for an ideal employee.  Google’s early success proved that diversity in the workplace needn’t be an act of altruism or an experiment in social engineering.  It was simply a good business decision.”

Read Full Post »

If you haven’t already, be sure to read our prior post before this one.  It’s the brief historical story of Lena, the early quality standard for the algorithms that enable the transfer of digital images, and the precursor of today’s ubiquitous JPEG picture-file format.  Once you know Lena’s original story, please read on.  (Our post is excerpted from the work of Emily Chang of Bloomberg BusinessWeek and her new book “Brotopia: Breaking Up the Boys Club of Silicon Valley.”)

When Deanna Needell, now a math professor at UCLA, first encountered “Lena” in a computer science class, she quickly learned that the original image was a nude (it was culled from the pages of Playboy in 1972), and it made her realize, “Oh, I am the only woman here.  I am different.”  Needell says, “It made gender an issue for me where it wasn’t before.”

Her male colleagues, predictably, didn’t see the big deal.  Said one, “when you use a picture like that for so long, it’s not a person anymore, it’s just pixels” – a statement that naively laid bare the very problem of sexism Needell and her colleagues were trying to point out.  But with so few women among the ranks of the programming class, that’s hardly a surprise.

It wasn’t always that way.

As we’ve pointed out previously in a post here, the early days of programming were predominantly fueled by women.  In that early, post-WWII era, programmers were mostly women, and the work was considered largely clerical in nature, and thus ‘better suited’ to women.  Only later, when the economy turned down and computers looked to be a key tool of the future, did men begin to enter the programming ranks, eventually even pushing women out as the image of computers and programming pivoted to something more suited to “introverts and antisocial nerds.”

In one pivotal study in the 1960s, two psychologists, William Cannon and Dallis Perry, profiled 1,378 programmers – of whom, by the way, only 186 were women.  Their results formed the basis for a “vocational interest scale” they believed could predict “satisfaction” – and thus, success – in the field.  They concluded that people who liked solving various types of puzzles made for good programmers, and that made sense.

But then they drew a second conclusion – based, remember, on their mostly male sample – that happy software engineers “shared one striking characteristic,” according to Ms. Chang: they don’t like people.  Cannon and Perry concluded that programmers “dislike activities involving close personal interaction and are more interested in things than people.”  As Ms. Chang pointedly notes: “There’s little evidence to suggest that antisocial people are more adept at math or computers.  Unfortunately, there’s a wealth of evidence to suggest that if you set out to hire antisocial nerds, you’ll wind up hiring a lot more men than women.”

So while in 1967 Cosmopolitan was letting it be known that “a girl senior systems analyst gets $20,000 – and up!” (equivalent to $150,000 today) and heralded women as ‘naturals’ at computer programming, by 1968, Cannon’s and Perry’s work had tech recruiters noting the “often egocentric, slightly neurotic, bordering on schizophrenic” demeanor of what was becoming a largely male cadre of coders, sporting “beards, sandals and other forms of nonconformity.”

Tests such as these remained the industry standard for decades, ensuring that the ‘pop culture trope’ of the male nerd eventually wound up putting computers on the boys’ side of the toy aisle.

By 1984, the year of Apple Inc.’s iconic “1984” Super Bowl commercial, the percentage of females earning degrees in computer science had peaked at 37%.  As the number of overall computer science degrees increased during the dot-com boom, notes Chang, “far more men than women filled those coveted seats,” and the percentage of women in the field would dramatically decline for the next 25 years.

We’ll finish out this series of posts with a look at the state of women in tech today and what that might mean for tomorrow, so stay tuned.


Read Full Post »

Emily Chang is a journalist and weekly anchor of Bloomberg BusinessWeek, and the author of “Brotopia: Breaking Up the Boys Club of Silicon Valley.”  Recently, she penned an article there about an old (in tech terms) digital artifact: the image of Lena Soderberg.  Lena first became famous in November 1972 when, as Lenna Sjooblom, she was featured as a centerfold in Playboy magazine.  That spread might have been the end of it but for the fact that researchers at the Univ. of Southern California computer lab were busy trying to digitize physical photographs into what would eventually become the JPEG (or .jpg) format we all know from Internet images today.

According to the lab’s co-founder, William Pratt, now 80, the group chose Lena’s portrait from a copy of Playboy brought to the lab by a student.  The team needed suitable photos on which to test their algorithms for compressing large image files so they could be transferred digitally between devices.  Apparently their search led them to Lena, the 21-year-old Swedish centerfold.  Go figure.

Lena ended up becoming famous in early engineering circles, and some refer to her as “the first lady of the Internet.”  Others see her as Silicon Valley’s original sin – the larger point of Ms. Chang’s article – but that’s a topic for another post.

Apparently, Lena’s photo was attractive from a technical perspective because it included, according to Pratt, “lots of high frequency detail that is difficult to code” – her boots, boa and feathered hat, presumably.

According to Ms. Chang, for the next 45 years, Lena’s photo (seen at the top of this post), featuring her face and bare shoulder, served as “the benchmark for image processing quality for the teams working on Apple Inc.’s iPhone camera, Google Images, and pretty much every other tech product having anything to do with photos.”

To this day engineers joke that if you want your image compression algorithm to make the grade, it had better perform well on Lena.

So to a lot of male engineers, Lena became an amusing historical footnote.  But to their female peers, it was seen as “just alienating.”  And it has a lot to do with some of the ingrained gender biases that permeate the tech industry to this day, where the majority of employees are still male.

There’s a much longer extract from Emily Chang’s essay that we’ll try to sum up in a succeeding post.  Stay tuned…

Read Full Post »

Mike Lazaridis is justly famous and wealthy for being the guy who co-invented the BlackBerry, the first ‘must-have’ personal digital assistant.  And Lazaridis says he “won’t be iPhoned again.”

With colleague Doug Fregin, Lazaridis has poured nearly half a billion dollars into projects involving quantum computing over the past 20 years, and now runs a venture company that supports the effort.  Pointing to past failures and to the scope and breadth of computing’s next frontier, quantum computing, he notes that “you have to build an industry.”  Being nimble, staying close to customers and constantly moving forward “can’t be done with just one company,” says Lazaridis.

Companies including Google, IBM and others are also chomping at the quantum bit, and so Lazaridis has chosen to make his well-placed, narrower venture bets on companies and technologies that could be commercialized in just a few years.

That’s important because quantum (as we’ve written about in this blog several times before) is tricky.  While classical computers handle their bits of information strictly as 1s and 0s, a quantum bit, or qubit, can be both one and zero at the same time, enabling a level of multi-tasking previously unthinkable.  The potential, in fields as diverse as weather, aviation and warfare, is enormous.  But quantum as it exists today is still on shaky ground.  The existing number of quantum computers is small, and as Bloomberg BusinessWeek reports in a recent article, “they become error-prone after mere fractions of a second – and researchers say perfecting them could take decades.”
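As a rough illustration of that difference – a toy simulation only, not how a real quantum computer is programmed – a single qubit can be described by two amplitudes whose squares give the odds of reading a 0 or a 1.  An equal superposition behaves like “both at once” right up until it is measured:

    import random

    # A qubit's state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1:
    # |a|^2 is the probability of measuring 0, |b|^2 the probability of measuring 1.
    equal_superposition = (2 ** -0.5, 2 ** -0.5)   # "both one and zero at the same time"

    def measure(state):
        a, _b = state
        # Measurement collapses the state: the outcome is random, weighted by the amplitudes.
        return 0 if random.random() < abs(a) ** 2 else 1

    counts = {0: 0, 1: 0}
    for _ in range(10_000):
        counts[measure(equal_superposition)] += 1
    print(counts)   # roughly 5,000 of each – the value is definite only once it's measured

The fragility Bloomberg describes comes from how hard it is to keep that delicate state intact for more than a fraction of a second.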

Hence the importance of aiming carefully.  Several of Lazaridis’ investments have come to market, or are close.  Isara Corp. sells security software it says can block quantum hacks and projects sales of $3M in 2018. High Q Technologies claims that by year-end it will be selling quantum sensors 100,000 times more sensitive than the tools pharmaceutical companies use today to develop drugs. (Our featured photo today is of a device used to test the superconducting films used on silicon at the atomic level.)

Lazaridis has also teamed up with former BlackBerry colleagues to connect quantum computers with conventional ones, in order to make quantum computing accessible to a wider audience.  Those efforts will still need to prove themselves viable as businesses, but the mere idea reinforces the industry’s conviction that the current state of computing will not remain the status quo, and that the future of computing is quantum.

It’s a race, he knows.  One driven by venture capital and the ability to put one’s money where one’s mouth is.

 

Read Full Post »
