Archive for the ‘Software, Technology, and Wow I Didn't Know That’ Category

It was nearly 30 years ago that a Dutchman by the name of Guido van Rossum set out to create a new programming language that would adhere to three principles.  First, it should be easy to read: braces and dangling punctuation marks would be replaced by indented white space surrounding fairly readable code words.  Second, it should let users create their own modules or objects that others could reuse as the basis of new programs.  And third, he thought, it should have a catchy name.  And so he named it after the British comedy troupe Monty Python.  The code package repository became known as “the Cheese Shop.”
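
As a rough illustration of those first two principles – this is our own toy example, not anything from the article – here is a complete little Python module.  Indentation, not braces, marks the block structure, and anyone who saves the file as, say, readable.py can import and reuse the function in a new program.

```python
# readable.py -- indentation, not braces, marks where each block begins and ends
def fahrenheit_to_celsius(temp_f):
    """Convert a Fahrenheit reading to Celsius."""
    return (temp_f - 32) * 5 / 9

for reading in [32, 72, 212]:
    print(reading, "F is about", round(fahrenheit_to_celsius(reading), 1), "C")
```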

Little did van Rossum know that one day his creation – Python – would become the most popular programming language, said to have received more Google search hits in the past year than Kim Kardashian, according to the editors at The Economist.  Today, Python is used by nearly 40% of professional developers (another 25% wish to use it, according to a programming forum).  It’s also popular with ‘ordinary folk,’ notes The Economist, and is snaring youthful adherents: about 40% of American schools now offer computer programming (compared with about 10% just five years ago), and among their students, about two-thirds of 10- to 12-year-olds have an account on Code.org’s website.

Mr. van Rossum recently stepped down from his longtime role as curator and supervisor of Python (“benevolent dictator for life,” he once called it) saying he has long been uncomfortable with the fame.

Like all languages, Python is not perfect.  The more traditional C and C++ languages offer a broader suite of lower-level functionality (lower-level meaning closer to the machine’s own instructions, and thus offering finer control over memory and performance).  Java is popular for large and complex apps, and JavaScript is generally the language of choice for apps that run in a web browser.

But Python’s simple syntax makes it easier to learn to code – and to share that code.  It’s used everywhere: by the CIA for hacking, by Pixar in producing films, in Google’s web page crawlers and in Spotify’s song-recommendation feature.  It’s also said to have recently become a language of choice for AI researchers, which suggests it’s truly here to stay.

And even the non-coders of the world have taken it up in non-technical jobs: marketers use it to build statistical models for their campaigns, and teachers use it to check whether they are distributing grades properly.  Citigroup has introduced a course in Python for its trainee analysts, and it’s a haven for those who have long relied on spreadsheets for data analysis.
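
As a minimal sketch of that kind of spreadsheet-style check – the grades below are invented, and only the Python standard library is used – a teacher might eyeball a grade distribution like this:

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical class grades -- invented numbers, purely for illustration.
grades = [91, 78, 85, 62, 88, 95, 73, 81, 69, 84, 77, 90]

print("mean:", round(mean(grades), 1))
print("std dev:", round(stdev(grades), 1))

# Bucket the grades into letter bands to see the shape of the distribution.
letters = Counter("A" if g >= 90 else "B" if g >= 80 else "C" if g >= 70 else "D"
                  for g in grades)
for band in "ABCD":
    print(band, "*" * letters[band])
```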

While no computing language can ever be all things to all people, specialization matters, and so Python just keeps growing.  Over the history of computing, any number of languages have grown, dominated and then faded: think Fortran in the early days, then Basic’s many versions throughout the PC’s heyday, and even Pascal, once presumed to be the lingua franca of PC-era computing.  Presumably Python’s day will fade as well, but now, nearly 30 years after its birth, when that day will come is anyone’s guess – and it probably lies in the distant future.

Read Full Post »

Recently The Economist took on the issue of how the internet has changed and what they think can be done about it.  As an article called “The Ins and Outs” from June 30th 2018 (multiple authors) notes:

“Until a few years ago most users, asked what they thought of the internet, would have rattled off a list of things they love about it – that it lets them stay in touch with friends, provides instant access to a huge range of information, sparks innovation, even helps to undermine authoritarian regimes.”

Early cyber-gurus like Tim Berners-Lee and Vint Cerf hoped to create a system “biased in favor of decentralization of power and freedom to act,” according to Harvard Professor Yochai Benkler.  When the first internet message was sent on October 29, 1969, a new era of unfettered openness appeared to be on the horizon.

Today, The Economist suggests, there is much disenchantment over the ways in which today’s net has become so much more centralized – and is becoming increasingly so.

In the West, the internet is dominated by a few giants like Facebook and Google, and in China by firms like Baidu, Alibaba and Tencent.  As historian Niall Ferguson argues, this pattern – a disruptive new network being infiltrated by a new hierarchy – has many historical precedents, from the invention of the printing press through the time of the Industrial Revolution.

The internet has become much more strictly controlled, they note, pointing out that when users mainly got online via desktop or laptop computers, they could stumble across amazing new services and try many things for themselves.  These days, however, more folks get online via phones and tablets that confine users “to carefully circumscribed spaces, which are hardly more exciting than television.”

In the early days, Vint Cerf and others worked to make the internet largely “permissionless.”  Any computer and any network could join in if it followed the right protocols.  Packets of data were handed from one network to another, regardless of their contents.  This looseness is what allowed Tim Berners-Lee to come up with the basic idea for the world wide web, which worked on top of the internet’s basic structure.

“These protocols were complemented by a set of organizations that allowed the rules to evolve, along with the software that puts them into effect” and kept them from being hijacked by outside interests.  Most notable among these was the Internet Engineering Task Force, where David Clark put forth the philosophy that probably best summed up the early innovators’ intent: “We reject: kings, presidents and voting.  We believe in: rough consensus and running code.”

This openness and flexible governance set off a flurry of innovation and activity, and by the mid-1990s millions of websites were up and running and thousands of startups had been launched.  Even after the dotcom bubble collapsed around 2000, decentralized activity like blogs and forums continued unabated, with links to one another creating an ever-evolving ecosystem.

Today the connections to transfer data still exist, as do ever-growing reams of data, much of it now residing in cloud-computing silos within mega-companies like Google and Facebook, all closely monitored and metered, while billions of smartphones tap their available – and closely curated and counted – content.  Someone is always watching.  And most likely, monetizing your data.

Whether it’s due to the workings of an authoritarian country or a monopolistic company, the internet’s protocols, governance and controls have swiftly and subtly evolved, and they are never going back.

[The original article in the June 30, 2018 Economist has many more sections and is worth a read.  We’ve reviewed and rendered here only a small portion today due to space constraints.  Again, it’s worth a read.]

 

Read Full Post »

Assuming you’ve digested our prior post on blockchain basics and its importance in creating secure transactions, we’ll look in this second post at some applications aimed at proving blockchain’s value to users everywhere.

We noted earlier that blockchain provides secure, digitally signed access to transactions for all members of the chain.  The data is distributed across a network of servers rather than held on a single server, making it transparent, immutable and secure for the reasons touted earlier.  So then, what can we do with this new framework that will make a difference in our own lives?  Let’s look at a few examples, as provided by Nir Kshetri, a business professor at the University of North Carolina, in a recent article in The Wall Street Journal.
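
Before the examples, here is a minimal sketch – ours, not the professor’s – of the hash-chaining idea behind that immutability, using only the Python standard library; a real blockchain adds digital signatures, consensus rules and distribution across many servers.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Each block records a transaction plus the hash of the block before it,
# so altering any earlier entry changes every hash that follows it.
chain = []
previous = "0" * 64  # placeholder hash for the first ("genesis") block
for transaction in ["Alice pays Bob 5", "Bob pays Carol 2"]:
    block = {"transaction": transaction, "previous_hash": previous}
    previous = block_hash(block)
    chain.append(block)

# Verification: recompute each hash and compare it with the next block's link.
for earlier, later in zip(chain, chain[1:]):
    assert later["previous_hash"] == block_hash(earlier), "chain has been tampered with"
print("chain intact:", len(chain), "blocks")
```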

Let’s start with distribution.  A number of companies, including Cisco Systems, Bosch and Bank of New York Mellon, have banded together to create a blockchain-based IoT security standard that binds weaker IoT identities like serial numbers, barcodes, UPCs and QR codes into stronger cryptographic entities.  Widespread use would allow device makers to securely distribute updates and patches, even if a device is moved or sold (since all of that information would be part of the blockchain transaction record).  Manufacturers can be sure they’re communicating with the right device.
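
The standard itself isn’t spelled out in the article, but the general idea of folding several weak identifiers into one tamper-evident fingerprint can be sketched roughly like this (the function name, fields and sample values are all our own invention):

```python
import hashlib
import json

def device_fingerprint(serial, barcode, qr_payload):
    """Fold several weak identifiers into a single tamper-evident digest."""
    record = {"serial": serial, "barcode": barcode, "qr": qr_payload}
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

# A manufacturer could record this fingerprint in a ledger entry and address
# later firmware updates to the fingerprint rather than to a mutable label.
print(device_fingerprint("SN-00123", "012345678905", "https://example.com/d/123"))
```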

In health care and banking, providers today store your personal information, and we as consumers have little control over who sees or shares it.  With blockchain, entire medical or banking records can be stored in encrypted ledgers under the patient’s private key.  Changes to the record can be communicated via public keys, and providers can see the data with the patient’s or customer’s authorization, but they can’t store it.
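
As a hedged sketch of how such a key pair might be used – this shows only the signing half of the story, relies on the third-party cryptography package, and the record text is invented – the holder of the private key can sign a record update that anyone holding the public key can verify:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The patient holds the private key; providers are given only the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

record_update = b"2018-07-01: blood pressure 120/80"  # invented sample entry
signature = private_key.sign(record_update)

# Anyone with the public key can confirm the update came from the key holder
# and was not altered in transit; verify() raises InvalidSignature otherwise.
public_key.verify(signature, record_update)
print("record update verified")
```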

In developing countries, land fraud stemming from administrative corruption is a real problem: corrupt officials alter property records to benefit themselves in exchange for bribes.  With blockchain, when a property changes ownership, the transaction record reflects the time, location, price and parties involved.  Government agencies can authenticate the title information when it is entered, and law enforcement agencies can inspect documents to enforce compliance with “know-your-customer” and anti-money-laundering policies.  Blockchain of course also protects against unauthorized access to the data, and the owner controls the ultimate record via the private key.

All these applications, and many more, are being (or have been) developed using blockchain technology, and we’re only at the beginning.  Challenges loom – bringing down costs at the IoT and labeling level, winning wider user-community acceptance, and better communicating blockchain’s many benefits to decision makers – but it’s all coming.  Already, nearly four in ten of the 369 companies surveyed a year ago by the Journal were deploying or considering deployment of blockchain technology.  It’s only a matter of time.

 

Read Full Post »

The Internet of Things promises to transform the way humans, through their machines, interact with the Internet, and when you think about it, it’s already here.  Manufacturing is a case in point.

Today, sensors attached to shop floor equipment are capable of sending reams of production and equipment data back to the ERP systems, which hold that data in a repository for future use or analysis.
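
To make that concrete, here is a minimal sketch of a sensor reading landing in such a repository – the machine name, metric and local SQLite file are stand-ins of our own; a production setup would use the ERP vendor’s own API or database:

```python
import sqlite3
import time

# A stand-in for the ERP's data repository (a local SQLite file for this sketch).
db = sqlite3.connect("shop_floor.db")
db.execute("""CREATE TABLE IF NOT EXISTS readings
              (machine_id TEXT, metric TEXT, value REAL, recorded_at REAL)""")

# One simulated shop-floor reading; real sensors would stream these continuously.
db.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
           ("press-07", "spindle_temp_c", 71.4, time.time()))
db.commit()

# Later analysis can query the accumulated history.
for row in db.execute("SELECT machine_id, metric, value FROM readings"):
    print(row)
```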

Handheld scanners are used to track the movement of parts, pieces and production.  Beyond that, they also aid warehouse workers in the movement – the picking, packing and shipping – of countless SKUs.  Everyone from the folks walking the floor to the folks in the front and back offices can have access to the same production and inventory information, in real time, at the same time.

Smartphones in the customer service and CRM arenas have made up-to-date information available on all manner of data, from customer sales and orders to on-hand inventory quantities to sales reports across a territory or across the globe.  From checking into hotels to letting physicians check patient records from their phones and tablets, the interconnectedness of machines, ERP and the web has reached critical mass.

It’s the full-flowering of Bill Gates’ long-ago promise of IAYF: Information At Your Fingertips.

By 2020, according to one industry analyst, fully 95% of products are expected to be IoT-enabled.  (We think that’s a bit optimistic, but the point is well taken nonetheless.)

All these advances have their advantages, of course.  They save time – lots of it.  They save money – again, lots of it.  They speed up delivery, improve customer responsiveness, enable customers to serve themselves, and generally raise the level of satisfaction among a wide range of customers and their supporting companies.

For companies today, frankly, there is little choice anymore: adopt and adapt, or be left behind.  Sometimes the toughest question becomes: where do we start?  Luckily, there’s no shortage of consultants and solution providers willing and able to provide the guidance needed to get started.  About the only thing you can’t do anymore is… wait.

 

 

Read Full Post »

Thousands of information security jobs are currently going unfilled in the U.S., and it is estimated that by the year 2025 demand for security workers will outstrip supply by 265,000 jobs, according to consultants at market research firm Frost & Sullivan.  Considering the high pay offered in the field (about $10,000 per month for a data-security analyst), that’s surprising.

Companies in the field are even willing to provide training and educational assistance to people with the right mix of ambition and talent, says John Simons, a reporter for The Wall Street Journal.  Degrees aren’t what’s required to get a foot in the door, according to insiders.  What matters more is whether candidates can demonstrate knowledge of computer networks, programming and critical thinking, according to Ryan Sutton, a tech recruiter for Robert Half.  He notes a lack of certified professionals in the field relative to today’s need.  Oh, and contacts help too.

The Computing Technology Industry Association (CompTIA) has three I.T. security certificates recognized by hiring managers, and it offers a general course on its website for “would-be cybersecurity analysts” that covers basics like network security, compliance, and threats and vulnerabilities.  More advanced credentials, endorsed by the U.S. Dept. of Defense and the NSA, include the CISSP, or Certified Information Systems Security Professional.

But certifications aren’t everything.  The job candidates who draw the most attention are those who can demonstrate an ability “to think like malicious hackers,” according to Dan Miessler of IOActive Inc., a Seattle-based cybersecurity company.  Instead of making claims about what you can do, show examples of your work online, suggests Miessler.

Many interviewed firms indicated that they include extensive training as part of their offerings.

Of course, while jobs may be plentiful, it still often comes down to who you know.  Anna Friedley is a cyber risk analyst who found her job not because of her degrees in math and, later, library science, or even her master’s degree in high-tech crime investigation.  No, she says it came from her love of knitting: a friend from her knitting circle asked her to meet for coffee, mentioned that her office had an opening, and asked whether she could come take their test and submit her resume.

Still, the skill set probably didn’t hurt.

 

 

 

Read Full Post »

On more than one occasion we’ve had to help customers who’ve been hacked for ransom get things back to normal as best we can.  Frankly, that’s not even what we do for a living, but hacking, phishing and general cybersecurity issues are so prevalent these days that none of us can avoid dealing with them at some level.

And for that reason, none of us can afford to ignore them.

Recently, The Wall Street Journal’s Chris Kornelis interviewed Andreas Luning, founder of Germany’s G Data Software and one of the first publishers of antivirus software (a product named Anti-Virus Kit), which Luning’s firm released thirty years ago.  That’s about how long viruses have been an issue.

When Kornelis asked Luning what’s different today, that is to say… “What does the public still not understand about viruses and cybersecurity?”… Luning responded: “The speed.”

He went on to say that “People can’t see or get an awareness of what computers can do in milliseconds.”  He added that if you get a “good computer virus” – one that tries to steal data or accumulate money – you won’t see it on your computer.  Such viruses work in the background – no sirens or alarms, he notes – and they do everything they can to stay there.  Thus, you have “no chance to see if your computer is affected by something.”

This, from a guy who has been dealing with this stuff since 1987 (the year our own company came to life), and even before there was an internet.  Luning got his first virus, he says, from an Atari gaming disk, and it was a minuscule 400 bytes.  It made itself persistent in memory and eventually copied itself onto all his other disks.  This, he says, “made me feel uncomfortable.”  He and a partner eventually found a way to detect the virus code and, as a result, a company was launched.

Back then, Luning notes, the hackers just wanted to see how far they could go, what they could get away with.  They might go so far as to flicker your screen or maybe even start to crash your computer.  Mostly, it was slightly nefarious programmer-hackers just showing off.

However, viruses went from being silly to dangerous in the late ’90s, and there’s been no let-up ever since.  Today, criminal-minded people don’t even need to be hackers anymore.  They can simply exploit tools found on the dark net, including ready-to-use, clickable kits for creating ransomware.  You don’t even need to be technical anymore.  Just criminal.

So the next time you consider whether or not to purchase and/or update your anti-virus software, just remember that Andreas Luning has warned you.

 

 

Read Full Post »

We’re hearing more and more these days about the advent of new 5G cellular technology.  While the first, second, third and even (mostly) fourth generations of cell technology have been primarily about how we use our phones or stream movies, 5G is going to be a game changer.  And since we here at the blog are all about business, technology, software and the future… we thought we’d share a few thoughts published by the editors of The Wall Street Journal in the May 2018 booklet entitled “The Future of Everything.”

As the editors note, “5G has the potential to dramatically reshape our lives.”  Following then are some of the impacts they see coming to all of us, thanks to the impending 5G network upgrades that will be coming to a town near you – and sooner than you think!

  1. 5G gives everything from cars and homes to drones and medical equipment instant access to the internet. This will extend wireless technology beyond our phones and “radically enhance machine-to-machine connections.”  5G will be available in dozens of U.S. cities by 2018, and is slated to roll out nationwide by 2020.
  2. Want to watch a movie? Today it takes about four minutes to download a movie on a 4G network.  With 5G, you’ll have it on your tablet, phone or smart TV in six seconds (see the back-of-the-envelope arithmetic just after this list).  And those 5G speeds will also allow theme-park visitors with connected headsets to stream hi-def virtual reality experiences while on a speeding roller coaster.  You go first.
  3. 5G makes self-driving cars a practical reality, with safety. 5G-equipped cars will see, know and understand their surroundings instantly, alerting nearby vehicles to, say, accidents ahead and perhaps even averting those dreaded pile-ups.  An ambulance can signal those cars to pull over long before a human driver could react to its siren, say the editors.
  4. The combination of cloud with real-time video and analytics will allow cities to better manage everything from power grids to traffic patterns. Sensors in water systems could detect and fix leaks before a break occurs, and smart streetlights could direct cars to parking spaces.  Energy monitoring will lead to reduced power usage and improved air quality.
  5. But as the Journal’s editors warn in their concluding comments, resources will be needed to make it all happen. Governments manage spectrum, and there simply isn’t enough high-powered spectrum currently allocated to bring 5G everywhere it should be.  Without that bandwidth, “we’ll be bottlenecking 5G’s game-changing speed and capping its potential.”  Governments need to allocate that spectrum ASAP, so industry can begin new network deployments and make universal 5G service a reality.
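
For the curious, the Journal’s four-minutes-versus-six-seconds claim in item 2 pencils out under some assumptions of our own (the movie size and link speeds below are ours, not theirs):

```python
# Back-of-the-envelope check on the four-minute vs six-second download claim.
movie_gigabytes = 1.5                 # assumed size of an HD movie
movie_megabits = movie_gigabytes * 8 * 1000

for label, megabits_per_second in [("4G (~50 Mbps)", 50), ("5G (~2 Gbps)", 2000)]:
    seconds = movie_megabits / megabits_per_second
    print(f"{label}: about {seconds:.0f} seconds")
# Prints roughly 240 seconds (four minutes) and 6 seconds, respectively.
```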

Somehow, it always comes back to the government, doesn’t it?

Read Full Post »

Older Posts »