
Tag: Moore’s Law

No Digital Skyscrapers

A colleague of mine shared an interesting article by Sarah Lacy from the tech site Pando Daily about technology's power to build the next set of "digital skyscrapers" – Lacy's term for enduring, 100-year brands in (or made possible by) technology. On the one hand, I wholeheartedly agree with one of Lacy's big takeaways: that more entrepreneurs need to strive to make a big impact on the world and not settle for quick-and-easy payouts. That is, after all, why venture capitalists exist: to fund transformative ideas.

But I fundamentally disagree with the article's premise. In fact, the very thing that draws me to technology – its ability to turn transformative ideas into reality – is why I don't think it's possible to build "100-year digital skyscrapers".

In fact, I genuinely hope it's not possible. Frankly, if I felt it were, I wouldn't be in technology, and certainly not in venture capital. To me, technology is exciting and disruptive precisely because you can't create long-standing skyscrapers. Sure, IBM and Intel have been around a while – but what they as companies do, what their brands mean, and their relative positions in the industry have changed radically. I just don't believe that the products we will care about, or the companies we think are shaping the future, ten years from now will be the same as the ones we are talking about today – any more than today's were the ones we talked about ten years ago, or will be the ones we talk about twenty years from now. I've done the 10-year comparison before to illustrate the rapid pace of Moore's Law, but just to be illustrative again, remember that 10 years ago:

  • the iPhone (and Android) did not exist
  • Facebook did not exist (Zuckerberg had just started at Harvard)
  • Amazon had yet to make a single cent of profit
  • Intel thought Itanium was its future (something it's basically given up on now)
  • Yahoo had just launched a dialup internet service (seriously)
  • The Human Genome Project had yet to be completed
  • Illumina (the poster child for next-generation DNA sequencing today) had just launched its first system product

And, you know what, I bet that 10 years from now I'll be able to make a similar list. Technology is a brutal industry, and it succeeds by continuously making itself obsolete. It's why it's exciting, and it's why I don't think – and, frankly, hope – that long-lasting digital skyscrapers will ever emerge.


Forget Moore's Law, it's all about Crayola's Law

As a tech aficionado, it's clear that I'm a fan of Moore's Law (the observation that technology seems to double in capacity every 2-3 years). But, as my friend Elizabeth pointed out, Crayola's Law (as coined by the economics blog Marginal Revolution) is far more visually awesome:


(Image taken from Marginal Revolution, original source Weather Unsealed)


Decade of Moore’s Law


I’ve mentioned Moore’s Law in passing a few times before. While many in the technology industry see the concept only on its most direct level – that of semiconductor scaling (the ability of the semiconductor industry, so far, to double transistor density every two or so years) – I believe this fails to capture its true essence. It’s not so much a law pertaining to a specific technology (which will eventually run out of steam when it hits a fundamental physical limit), but an “economic law” about an industry’s learning curve and R&D cycle relative to cost per “feature”.

Almost all industries experience a learning curve of some sort. Take the automotive industry: despite all of its inefficiencies, the cost of driving one mile has declined over the years because of improvements in engine technology, parts manufacturing, general manufacturing efficiency, and supply chain management. But very few industries have a learning curve that operates at the same speed (how rapidly an industry improves its economic performance) and steepness (how much efficiency improves given a certain amount of "industry experience") as the technology industry, which can rely not only on learning curves but also on disruptive technological changes.
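
To make that speed-and-steepness contrast concrete, here's a minimal back-of-the-envelope sketch; the improvement rates are illustrative assumptions on my part, not measured figures:

```python
# Illustrative-only comparison of two learning curves over a decade.
# The rates below are assumptions for the sake of the example, not sourced data.
YEARS = 10

# A "conventional" industry: assume cost per mile falls ~3% per year.
car_cost_per_mile = 1.0
for _ in range(YEARS):
    car_cost_per_mile *= 1 - 0.03

# Semiconductors under Moore's Law: cost per transistor halves every ~2 years.
transistor_cost = 1.0
for _ in range(YEARS):
    transistor_cost *= 0.5 ** (1 / 2)

print(f"Cost per mile after {YEARS} years:       {car_cost_per_mile:.2f}x the original")  # ~0.74x
print(f"Cost per transistor after {YEARS} years: {transistor_cost:.3f}x the original")    # ~0.031x
```

Both curves head down, but one is a gentle slope and the other falls off a cliff – which is the whole point of treating Moore's Law as an economic law rather than a purely technical one.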

One of the best illustrations I’ve seen of this is a recent post on MacStories comparing a 2000 iMac and Apple’s new iPhone 4:

                2000 iMac                                 2010 iPhone 4
Processor       500 MHz PowerPC G3 CPU                    1 GHz ARM A4 CPU
RAM             128 MB                                    512 MB
Graphics        ATI Rage 128 Pro (8 million triangles)    PowerVR SGX 535 (28 million triangles)
Storage         30 GB hard drive                          32 GB NAND flash
Weight          34.7 pounds                               4.8 ounces

Although the comparisons are not necessarily apples-to-apples, they give a sense of the speed at which Moore’s Law progresses. Amazing, no?
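
To put rough numbers on the table above, here's a quick (and admittedly crude) set of ratios; the figures come straight from the table, and the only extra step is converting the iMac's weight to ounces:

```python
# Rough ratios computed from the table above; a sanity check, not a benchmark.
specs = {
    # metric: (2000 iMac, 2010 iPhone 4)
    "Clock speed (MHz)":    (500,       1000),
    "RAM (MB)":             (128,        512),
    "Triangles (millions)": (8,           28),
    "Weight (ounces)":      (34.7 * 16,  4.8),
}

for metric, (imac, iphone) in specs.items():
    print(f"{metric}: the iPhone 4 is {iphone / imac:.2f}x the iMac's figure")
# Clock ~2x, RAM ~4x, graphics ~3.5x -- delivered in a device roughly 1/116th the weight.
```

The raw clock and memory ratios look modest precisely because this isn't an apples-to-apples comparison: a decade of Moore's Law went into squeezing a desktop's worth of computing into something you can hold in one hand.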



Look ma, no battery!

While Moore's Law may make it harder to be a tech company, its steady march makes it great to be an energy-conscious consumer, as one of its effects is to drive down power consumption generation after generation of product. Take the example of smartphones like Apple's iPhone or Motorola's new Droid: Moore's Law has made it possible to take computing power that used to need a large battery or power source (like in a laptop or a desktop) and put it in a mobile device with a tiny rechargeable battery!

Some folks at NEC and Soundpower took advantage of this in a very cool way (HT: TechOn via Anthony). By combining NEC's specialty in extremely low-power chips with Soundpower's expertise in creating vibration-based power generators, the two companies were able to produce a battery-less remote control powered only by users pressing the buttons!

It makes me wonder where else this type of extremely low-power circuitry and simple energy generation setup could be useful: sensor networks? watches? LEDs? personal-area-networks?

And at the end of the day, that's one of the things that makes the technology industry so interesting (and challenging to understand): every new device can enable a whole new set of applications and uses.



HotChips 101

This post is almost a week overdue thanks to a hectic work week. In any event, I spent last Monday and Tuesday immersed in the high-performance chip world at the 2009 HotChips conference.

Now, full disclosure: I am not an electrical engineer, nor was I even formally trained in computer science. At best, I can "understand" a technical presentation in a manner akin to how my high school biology teacher explained his "understanding" of the Chinese language: "I know enough to get in trouble."
But despite all of that, I was given a rare look at a world that few non-engineers ever get to see, and yet it is one which has a dramatic impact on the technology sector given the importance of these cutting-edge chip technologies in computers, mobile phones, and consumer electronics.

So here's my business-strategy, non-expert-enthusiast view of the six big highlights I took away from the conference – the ones that best inform technology strategy:

  1. We are 5-10 years behind on the software development technology needed to truly get performance out of our new chips. Over the last decade, computer chip companies discovered that simply ramping up clock speeds (the Megahertz/Gigahertz number that everyone talks about when describing how fast a chip is) was not going to cut it as a way of improving computer performance (because of power consumption and heat issues). As a result, instead of making the cores (the processing engines) on a chip faster, chip companies like Intel resorted to adding more cores to each chip. The problem with this approach is that performance becomes highly dependent on software developers being able to create software which can figure out how to separate tasks across multiple cores and share resources effectively between them – something which is "one of the hardest if not the hardest systems challenge that we as an industry have ever faced" (courtesy of UC Berkeley professor Dave Patterson). (A toy sketch of what this kind of task-splitting looks like follows this list.) The result? Chip designers like Intel may innovate to the moon, but unless software techniques catch up, we won't get to see any of it. Is it any wonder, then, that Intel bought multi-core software technology company RapidMind, or that other chip designers like IBM and Sun are so heavily committed to creating software products to help developers make use of their chips?
  2. Computer performance may become more dependent on chip accelerator technologies. The traditional performance "engine" of a computer was the CPU, a product which has made the likes of Intel and IBM fabulously wealthy. But the CPU is a general-purpose "engine" – a jack of all trades, but a master of none. In response, companies like NVIDIA, led by HotChips keynote speaker Jen-Hsun Huang, have begun pushing graphics chips (GPUs), traditionally used for gaming or editing movies, as specialized engines for computing power. I've discussed this a number of times over at the Bench Press blog, but the basic idea is that instead of relying solely on the jack-of-all-trades-and-master-of-none CPU, a system should use specialized chips to address specialized needs. Because a lot of computing power is burned on the mathematical tasks a GPU is suited to do, the signal processing work a digital signal processor is better at, or the cryptography work a cryptography accelerator is built for, this opens the door to the use of other chip technologies in our computers. NVIDIA's GPU solution is one of the most mature, as they've spent a number of years developing a platform they call CUDA, but the message was clear: as the performance we care about becomes more and more specialized (like graphics or number crunching or security), special chip accelerators will become more and more important.
  3. Designing high-speed chips is now less and less about "chip speed" and more and more about memory and input/output. An interesting blog post by Gustavo Duarte highlighted something fascinating to me: your CPU spends most of its time waiting for things to do. So much time, in fact, that the best way to speed up your chip is not to speed up your processing engine, but to speed up getting tasks into your chip's processing cores. The biological analogy is something called a perfect enzyme – an enzyme that works so fast that its speed is limited by how quickly it can get hold of things to work on. As a result, every chip presentation spent roughly two-thirds of the time talking about managing memory (where the chip stores the instructions it will work on) and managing how quickly instructions from the outside (like from your keyboard) get to the chip's processing cores. In fact, one of the IBM POWER7 presentations spent almost the entire time discussing the POWER7's use and management of embedded DRAM technology to speed up how quickly tasks can get to the processing cores. (A small illustration of this memory bottleneck also follows this list.)
  4. Moore's Law may no longer be as generous as it used to be. I mentioned before that one of the big "facts of life" in the technology space is the ability of the next product to be cheaper, faster, and better than the last – something I attributed to Moore's Law (an observation that chip technology doubles in capability every ~2 years). At HotChips, there was a fascinating panel discussing the future of Moore's Law, mainly asking (a) will Moore's Law continue to deliver benefits, and (b) what happens if it stops? The answers were not very uplifting. While there was a wide range of opinions on how much we'd be able to squeeze out of Moore's Law going forward, there was broad consensus that the days of just letting Moore's Law lower your costs, reduce your energy bill, and increase your performance simultaneously are over. The amount of money it costs to design next-generation chips has grown exponentially (one panelist cited a cost of $60 million just to start a new custom project), and the cost of operating a semiconductor factory has skyrocketed into the billions. And, as one panelist put it, constantly riding the Moore's Law technology wave has forced the industry to rely on "tricks" that no longer deliver the full set of benefits Moore's Law typically brought. The panelists warned that future chip innovations were going to be driven more and more by design and software rather than by blindly following Moore's Law, and that unless new ways to develop chips emerged, the chip industry could find its progress slowing.
  5. Power management is top of mind. The second keynote speaker, EA Chief Creative Officer Richard Hilleman, noted something that gave me significant pause: in 2009, China will probably produce more electric cars in one year than have ever been produced in all of history. The impact on the electronics industry? It will soon be very hard to find and very expensive to buy batteries. This, coupled with consumers' desire for longer battery life in their computers, phones, and devices, means that managing power consumption is critical for chip designers. In each presentation I watched, I saw the designers roll out a number of power management techniques – the most amusing of which was employed by IBM's new POWER7 uber-chip. The POWER7 could implement four different low-power modes (so that the system could tune its power consumption), which were humorously named: doze, nap, sleep, and "Rip van Winkle".
  6. Chip designers can no longer just build "the latest and greatest". There used to be one playbook in Silicon Valley – build what you did a year ago, but make it faster. That playbook is fast becoming irrelevant. No longer can Silicon Valley just count on people buying bigger and faster computers to run the latest and greatest applications. Instead, people are choosing to buy cheaper computers to run Facebook and Gmail, which, while interesting and useful, no longer need the CPU or monitor with the greatest "digital horsepower." EA's Richard Hilleman noted that this trend is especially important in the gaming industry. Where before the gaming industry focused on hardcore gamers who spent hours and hours building their systems and playing immersive games, today the industry is keen on building games with clever mechanics (e.g. a Guitar Hero or a game for the Nintendo Wii) for people with short attention spans who aren't willing to spend hours holed up in front of their televisions. Instead of focusing on pure graphical horsepower, gaming companies today want to build games which can be social experiences (like World of Warcraft) or which can be played across many devices (like smartphones or over social networks). With stores like GameStop on the rise, gaming companies can no longer count on just selling games; they need to figure out how to sell "virtual goods" (like upgrades to your character or weapons), sell in-game advertising (a Coke billboard in your game?), or encourage users to subscribe. What this all means is that, to stay relevant, technology companies can no longer just gamble on their ability to make yesterday's product faster – they have to make it better, too.
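
As promised in item 1, here's a minimal sketch of what splitting a task across cores can look like, using Python's standard library; the prime-counting workload and the chunking scheme are made up purely for illustration:

```python
# Toy illustration of fanning a CPU-heavy job out across cores.
# The hard part described in item 1 isn't this mechanical fan-out; it's
# partitioning real programs so the pieces don't fight over shared data.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) -- a stand-in for any CPU-bound chunk of work."""
    lo, hi = bounds

    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))

    return sum(is_prime(n) for n in range(lo, hi))

if __name__ == "__main__":
    N, WORKERS = 200_000, 4
    step = N // WORKERS
    chunks = [(start, start + step) for start in range(0, N, step)]

    # Serial: one core grinds through every chunk.
    serial_total = sum(count_primes(chunk) for chunk in chunks)

    # Parallel: the same chunks, spread across separate processes (and cores).
    with ProcessPoolExecutor(max_workers=WORKERS) as pool:
        parallel_total = sum(pool.map(count_primes, chunks))

    assert serial_total == parallel_total
    print(parallel_total)
```

Even in this toy, the parallel version only helps if the chunks are roughly equal and independent – which hints at why the general problem is so hard.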
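
And for item 3, here's a small experiment (assuming NumPy is installed) that hints at how memory, not arithmetic, limits performance: summing every eighth element of a large array does an eighth of the additions, yet on many machines it isn't anywhere close to eight times faster, because both passes are limited by how fast data streams in from main memory.

```python
import time
import numpy as np

# ~0.5 GB of float64s -- far larger than any CPU cache, so every pass
# below has to stream its data in from main memory.
x = np.arange(64_000_000, dtype=np.float64)

def timed(label, fn):
    start = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - start:.3f} s")

timed("sum of every element    ", lambda: x.sum())
timed("sum of every 8th element", lambda: x[::8].sum())
# The strided pass performs 1/8th of the additions, but it still touches
# roughly one value per cache line, so the memory system moves a similar
# number of bytes either way.
```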

There was a lot more that happened at HotChips than I can describe here (and I skipped over a lot of the more techy details), but those were six of the most interesting messages that I left the conference with, and I am wondering if I can get my firm to pay for another trip next year!

Oh, and just to brag: while at HotChips, I got to see a demo of the potential blockbuster game Batman: Arkham Asylum while checking out NVIDIA's 3D Vision product! And I have to say, I'm very impressed by both – and am now very tempted by NVIDIA's "buy a GeForce card, get Batman: Arkham Asylum free" offer.



Tech strategy 101

Working on tech strategy for 18 months ingrains a thing or two in your head about strategy for tech companies, so I thought I'd lay out, in one blog post (which may itself turn into a series), the major lessons I've learned about how strategy in the technology sector works.

To understand that, it's important to first ask what makes technology special. From that perspective, there are three main things which drive tech strategy:

  1. Low cost of innovation – Technology companies need to be innovative to be successful, duh. But the challenge with tech strategy is not innovation itself; it's that innovation in technology is cheap. Your product can be as easily outdone by a giant with billions of dollars like Google as by a couple of bright guys in a garage who still live with their parents.
  2. Moore’s Law – When most technologists think of Moore’s Law, they think of its academic consequences (mainly that chip technology doubles every two years). This is true (and has been for over 50 years), but the strategic consequence of Moore’s Law can be summed up in six words: “Tomorrow will be better, faster, cheaper.” Can you think of any other industry which has so quickly and consistently increased quality while lowering cost?
  3. Ecosystem linkages – No technology company stands alone. They are all interrelated and interdependent. Facebook may be a giant in the Web world, but its success depends on a wide range of relationships: it depends on browser makers adhering to web standards, on Facebook application developers wanting to use the Facebook platform, on hardware providers selling the right hardware to let Facebook handle the millions of users who want to use it, on CDNs/telecom companies providing the right level of network connectivity, on internet advertising standards, etc. This complex web of relationships is referred to by many in the industry as the ecosystem. A technology company must learn to understand and shape its ecosystem in order to succeed.

Putting it all together, what does it mean? Four things:

I. Only the paranoid survive
This phrase, popularized by ex-Intel CEO Andy Grove, is very apt for describing the tech industry. The low cost of innovation means that your competition could come from anywhere: well-established companies, medium-sized companies, hot new startups, enterprising university students, or a legion of open source developers. The importance of ecosystem linkages means that your profitability depends not only on what's going on with your competitors, but also on the broader ecosystem. If you're Microsoft, you don't only have to think about what competitors like Apple and Linux are doing; you also need to think about the health of the overall PC market, about how to connect your software to new smartphones, and about many other ecosystem concerns which affect your profitability. And the power of Moore's Law means that new products need to be rolled out quickly, as old products rapidly turn into antiques as technology advances. The result of all of this is that only the technology companies which are constantly fearful of emerging threats will succeed.

II. To win big, you need to change the rules
The need to be constantly innovative (Moore's Law and the low cost of innovation) and the importance of ecosystem linkages favor large, incumbent companies, because they have the resources and manpower to invest in marketing, support, and R&D, and they are the ones with the existing ecosystem relationships. As a result, the only way for a little startup to win big, or for a large company to attack another large company, is to change the rules of competition. For Apple, winning in a smartphone market dominated by Nokia and RIM required changing the rules of "traditional" smartphone competition by:

  • Building a new type of user interface, driven by an accelerometer and touchscreen, unlike anything seen before
  • Designing a smartphone web browser actually comparable to what you'd expect on a PC, as opposed to a pale imitation
  • Building an application store to help establish a new definition of smartphone – one that runs a wide range of software rather than one that runs only software from the carrier/phone manufacturer
  • Bringing the competition back to Apple’s home turf of making complete hardware and software solutions which tie together well, rather than just competing on one or the other

Apple's iPhone not only provided a tidy profit for Apple; it completely took by surprise both RIM, which had been betting on taking its enterprise features into the consumer smartphone market, and Nokia, which had been betting on its services strategy. Now, Nokia and every other phone manufacturer is desperately trying to compete in a game designed by Apple – no wonder Nokia recently forecast that it expects its market share to continue to drop.
But it’s not just Apple that does this. Some large companies like Microsoft and Cisco are masters at this game, routinely disrupting new markets with products and services which tie back to their other product offerings – forcing incumbents to compete not only with a new product, but with an entire “platform”. Small up-and-comers can also play this game. MySQL is a great example of a startup which turned the database market on its head by providing access to its software and source code for free (to encourage adoption) in return for a chance to sell services.

III. Be a good ecosystem citizen
Successful tech companies cannot solely focus on their specific markets and product lines. The importance of ecosystem linkages forces tech companies to look outward.

  • They must influence industry standards, oftentimes working with their competitors (case in point: look at the corporate membership of the Khronos Group, which controls the OpenGL graphics standard), to make sure their products are supported by the broader industry.
  • They oftentimes have to give away technology and services for free to encourage the ecosystem to work with them. Even mighty Microsoft, whose CEO once called Linux "a cancer", has had to open source 20,000 lines of operating system code in an attempt to make the Microsoft server platform more attractive to Linux technology. Is anyone surprised that Google and Nokia have open sourced the software for their Android and Symbian mobile phone operating systems and have gone to great lengths to make it easy for software developers to write software for them?
  • They have to work collaboratively with a wide range of partners and providers. Intel and Microsoft work actively with PC manufacturers to help with marketing and product targeting. Mobile phone chip manufacturers invest millions in helping mobile phone makers and mobile software developers build phones with their chip technology. Even "simple" activities like outsourcing manufacturing require a strong partnership in order to get things done properly.
  • The largest companies (e.g. Cisco, Intel, Qualcomm) take this influence game a step further by creating corporate venture groups to invest in startups, oftentimes for the purpose of influencing the ecosystem in their favor.

The technology company that chooses not to play nice with the rest of the ecosystem will rapidly find itself alone and unprofitable.

IV. Never stop overachieving
There are many ways to screw up in the technology industry. You might not be paranoid enough and watch as a new competitor or Moore's Law eats away at your profits. You might not present a compelling enough product and watch as your partners and the industry as a whole shun your product. The terrifying thing is that this is true regardless of how well you were doing a few months ago – it could just as easily happen to a market leader as to a market follower (e.g. Polaroid watching its profits disappear when digital cameras entered the scene).
As a result, it's important for every technology company to keep its eye on the ball in two key areas, both to reduce the chance of a misstep and to increase the chance of recovering when one eventually happens:

  • Stay lean – I am amazed at how many observers of the technology industry (most often the marketing types) seem to think that things like keeping costs low, setting up a good IT system, and maintaining a nimble yet deliberate decision process are unimportant as long as you have an innovative design or technology. This is very short-sighted, especially when you consider how easy it is for a company to take a wrong step. Only the lean and nimble companies will survive the inevitable hard times, and, in good times, it is the lean and nimble companies that can afford to cut prices and offer more and better services than their competitors.
  • Invest in innovation – At the end of the day, technology is about innovation, and the companies which consistently grow and turn a profit are the ones that invest in it. If your engineers and scientists aren't getting the resources they need, no amount of marketing or "business development" will save you from oblivion. And if your engineers and scientists are cranking out top-notch research and development, then even if you make a mistake, there's a decent chance you'll be ready to bounce right back.

Obviously, each of these four "conclusions" needs to be fleshed out further with details and concrete analyses before it can truly be called a "strategy". But I think they form a very useful framework for understanding how to make a tech company successful (even though they don't give any magic answers), and any exec who doesn't understand them will eventually learn them the hard way.
