Tesla backlash shows misunderstanding of reality

September 15, 2017

Upstart electric vehicle manufacturer Tesla is in hot water again...no, not for its unwisely named Autopilot system (although the heat's still on for that, too). This time it's the company's undoubtedly well-intentioned, but not universally well-received, recent moves in Florida. In advance of Hurricane Irma, Tesla pushed an over-the-air update to certain Model S and Model X vehicles registered to owners in the forecasted-impact areas, temporarily giving those cars additional range so their owners could flee further ahead of the incoming storm. Naysayers promptly made hay with an abundance of complaints: about the seeming overreach of our "corporate overlords", for example, along with the fundamental "fairness" of charging extra for differentiation driven by software rather than hardware.

Here's the background: Tesla had been selling two variants each of the Model S and Model X, one with a 75 kWh-spec'd battery and the other (slightly less expensive) spec'd at 60 kWh. The latter models delivered about 30 fewer (driver- and route-dependent) miles of range between recharges. In reality, however, both variants contained exactly the same battery pack; the demarcation was controlled entirely by software. And it's not as if Tesla was hiding anything; the company told "60 kWh" vehicle owners upfront that for a few thousand bucks, at any point down the road (pun intended), it would remote-upgrade their cars. The same upgrade concept applies to the faster-acceleration "Ludicrous Mode", for example, and to the aforementioned Autopilot. And as a regular practice, the company pushes firmware updates to its deployed fleet to fix bugs and add features (of variable value).
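To make the concept concrete, here's a minimal sketch of how a software-capped battery might work. To be clear, this is not Tesla's actual firmware; the structure names, flag names, and the 15 kWh delta are my own illustrative assumptions, modeled on the publicly described 60-vs-75 kWh behavior.

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical sketch: the pack hardware is always 75 kWh; firmware
 * caps the usable capacity based on configuration flags. */
#define PACK_CAPACITY_KWH 75.0

typedef struct {
    bool upgrade_purchased;  /* owner paid for the permanent 75 kWh unlock */
    bool emergency_override; /* temporarily set by an OTA push, a la Irma */
} vehicle_config_t;

double usable_capacity_kwh(const vehicle_config_t *cfg) {
    if (cfg->upgrade_purchased || cfg->emergency_override)
        return PACK_CAPACITY_KWH;     /* full 75 kWh available */
    return PACK_CAPACITY_KWH - 15.0;  /* software-limited "60 kWh" trim */
}
```

The point of the sketch: flipping one configuration bit (whether by a paid upgrade or a temporary emergency push) is all it takes, because the extra capacity was physically there all along.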



Such "differentiation by software" is a well-known, longstanding phenomenon in the IC world, of course. Take CPUs, for example: via a combination of feature-tailored microcode (thankfully also used to fix bugs in the field) and fusible links blown-and-grown during pre- and post-package testing, a common sliver of silicon can be made to take the form of an abundance of unique product variants with different (off the top of my head):

  • Clock speeds (both base and "turbo") and over-clock capabilities
  • Cache sizes and types
  • Core counts
  • Features such as simultaneous multithreading (Intel's Hyper-Threading), floating-point coprocessor support, out-of-order instruction execution capability, and the like
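The fuse-based part of that differentiation can be sketched in a few lines. This is a toy model, not any vendor's actual fuse map: the bit assignments, core count, and cache size below are invented purely for illustration of how one die plus a handful of test-time fuse bits yields multiple SKUs.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical fuse word, blown during pre-/post-package test,
 * telling common silicon which features to expose. */
#define FUSE_SMT_ENABLED    (1u << 0)  /* enable simultaneous multithreading */
#define FUSE_HALF_CACHE     (1u << 1)  /* fuse off half the L3 cache */
#define FUSE_TWO_CORES_OFF  (1u << 2)  /* fuse off two of the four cores */

typedef struct {
    int cores;
    int l3_cache_kb;
    int smt_threads_per_core;
} cpu_sku_t;

/* Derive the shipping SKU from the full-featured die plus fuses. */
cpu_sku_t derive_sku(uint16_t fuses) {
    cpu_sku_t sku = { .cores = 4, .l3_cache_kb = 8192, .smt_threads_per_core = 1 };
    if (fuses & FUSE_SMT_ENABLED)   sku.smt_threads_per_core = 2;
    if (fuses & FUSE_HALF_CACHE)    sku.l3_cache_kb /= 2;
    if (fuses & FUSE_TWO_CORES_OFF) sku.cores -= 2;
    return sku;
}
```

One die, three fuse bits, eight possible product variants; scale the fuse word up and the SKU explosion seen in real CPU price lists follows naturally.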


The same goes for graphics processors, whether implemented as standalone GPUs or as cores integrated within a larger SoC. The same piece of silicon, appropriately software-configured and housed in various pin-count packages, can support varying amounts (and types, and speeds) of local frame buffer memory, for example. And both AMD and NVIDIA make a lot of incremental money on HPC (high-performance computing)- and professional graphics-tailored chips that "under the hood" are exactly the same as their consumer GPU equivalents, albeit with additional features (along with API and application compatibility modes) enabled and fully tested.

Software is equally configurable via...err...software. I'm not a "gamer", for example, but I'm anecdotally aware of any number of computer and console games (both downloadable and physical disc-based) that will let you customize your race car or your avatar's clothing and weaponry for a bit more "coin". And let's not forget (although I sometimes wish I could) the abundance of operating system proliferations present, for example, in the Windows 7 suite:

  • Starter
  • Basic
  • Home
  • Professional
  • Ultimate


and similarly characteristic of various office application suite combinations. Since you can advance from one variant to another simply by buying and entering an upgrade code, all of the bits for the fullest-featured variant are obviously present from day one...just not enabled.

Maybe, I then thought, the root cause of the backlash is simply that consumers aren't yet exposed to software-based differentiation at the system level. But whether they realize it or not, they already are. Two recent teardowns I've tackled, for example, showcased common hardware router foundations that became distinct products from distinct suppliers, with distinct feature sets, via firmware differentiation alone. The same goes for cameras; open-source hacking can transform an entry-level model into a higher-end variant by unlocking additional features and settings.

Well then, I next surmised, maybe the hang-up is with cars in particular. But I quickly dashed that thought, too. It's generally known (at least I think it is) that mainstream and premium brands and models from the same supplier share common drivetrains and other hardware subsystems:

  • Acura vs. Honda
  • Infiniti vs. Nissan
  • Lexus vs. Toyota, etc.


which, to be clear, makes complete sense from a high-volume manufacturing cost-efficiency standpoint. And this "sharing the development load" concept even applies across manufacturers; even though Daimler no longer owns Chrysler (and therefore no longer owns Chrysler subsidiary Jeep, either), the common lineage of Jeep's Grand Cherokee and Wrangler with Mercedes-Benz's M-Class and G-Class counterparts is indisputable.

At the end of the day, I'm a bit baffled by all the uproar around Tesla's generosity to potential hurricane victims, though I admit mine's the perspective of an engineer who's been (for example) involved with flash memory-upgradeable firmware since his time at Intel in the early '90s. Tesla's ability to differentiate and update its hardware via over-the-air software "pushes" is longstanding and well documented, as I've mentioned before, and the company has done nothing to obfuscate that capability.

Perhaps what's changed is simply that Tesla (along with co-founder and CEO Elon Musk) is becoming increasingly well known, both to shareholders and (thanks to the upcoming mainstream-priced Model 3) to potential customers. To these folks, I can only offer this advice: get used to it. Software-based capabilities in vehicles follow in the footsteps, as I've documented here, of plenty of other examples at the IC, software, and systems levels, and will undoubtedly only expand, both here and elsewhere, in the future.

Agree or disagree, readers? Sound off in the comments.




Brian Dipert is Editor-in-Chief of the Embedded Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company's online newsletter.


