Category Archives: Technology

The Myth of Buying Market Share

A few years after I became an analyst/consultant at the Gartner Group, I was introduced by one of the DBMS vendors to the thoughts of Geoffrey Moore, who had some original ideas about the challenges high-tech companies face in introducing their disruptive products to mainstream buyers. His book, ‘Crossing the Chasm’ (1991), quickly became a classic in technology circles (see https://en.wikipedia.org/wiki/Crossing_the_Chasm), and I adopted his ideas in evaluating and guiding the strategies of companies in my bailiwick. Some CEOs claimed to be familiar with the theories, and even to be putting them into practice, but since the central message for the early years of the Technology Adoption Life Cycle was ‘focus’, they understandably struggled to keep their companies in line. ‘Chasm’ thinking requires a proper marketing perspective, but independent VPs of Marketing in technology start-ups are a bit of a luxury, and VPs of Sales always think of Marketing as something that supports their Sales Plan, rather than of their Sales Plan as something which realizes the Marketing Plan. Trying to close a deal with unqualified and unsuitable prospects is frequently an exciting challenge for such types.

As my career at Gartner wound down, and I considered retirement, I chose to move to a small software company in Connecticut. I was quickly brought down to earth: as a Gartner consultant, I had earlier been engaged by the company for a day’s work, at quite high fees, during which the managers attending dutifully wrote down all I said, and nodded appreciatively. When I became an employee, however, and started suggesting (as VP of Strategic Planning) to the CEO how she might want to change some of the processes (such as not having the R & D plan changed each month after the latest visit by a customer or prospect to the development facility in Florida), I was swiftly told: ‘You don’t understand how we do things around here, Tony’. That was not a good sign. So I picked up my thinking about Chasm Crossing, tried to talk my CEO out of an acquisition strategy (devised to show muscle to the Wall Street analysts, but in fact disastrous), and reflected on how financial analysts misled investors about markets. I had learned a lot from the first software CEO I worked for, back in the early 1980s, but he was another who didn’t understand the growth challenge. ‘Entrepreneurial Critical Mass’ was the term he had used to persuade his owners to invest in an acquisition strategy that was equally misguided: I had had to pick up the pieces and try to make it work.  (This gentleman was also responsible for bringing to the world the expression ‘active and passive integrity in and of itself’ to describe the first release of a new feature, which presumably meant that it worked perfectly so long as you didn’t try to use it.)   My renewed deliberations now resulted in an article, titled ‘The Myth of Buying Market Share’, which explained how completely bogus estimates of ‘market size’ misled CEOs and investors into thinking that all they had to do to be successful was to pick up a portion of a fast-growing ‘market’. I believe it was published somewhere, but I cannot recall where.

I reproduce the article here. I have not changed a word: it could benefit from some tightening up in a few places, and some fresher examples, but otherwise I would not change a thing, even though it is now sixteen years old. At the time I wrote it, I contacted Geoffrey Moore, and sent him the piece. We spoke on the phone: he was very complimentary about my ideas, and we arranged to meet for dinner in San Francisco, where I was shortly to be attending a conference. I vaguely thought that I might spend my last few years actually putting into practice some of the notions that had been most useful to me in my analyst role, and wanted to ask Moore about opportunities at the Chasm Group. So, after the day’s sessions were over, I approached him, introduced myself, and said how much I was looking forward to dinner. He was brusque – dinner was off. Obviously something better, somebody more useful, had come along. I was for a few minutes crestfallen, but then realized that I would never want to work for someone who behaved that rudely. I resigned from the software company a month later and began my retirement a bit earlier than planned. Since then I have never touched the industry again, apart from one day’s work for another small software company in New Jersey that desperately needed help, and wanted to hire me as VP of Marketing after I did a day’s consulting for them. North Carolina beckoned, and I have never regretted getting out when I did.

After receiving a fascinating observation from a reader (via Nigel Rees), I have posted an update to my piece on ‘The Enchantment’. The normal set of Commonplace items can be found here. (January 31, 2016)

2 Comments

Filed under Economics/Business, Personal, Technology

The Congenial Richard Dawkins

When I was in my early twenties, I read a book titled something like ‘Why Darwin Is Wrong’. It wasn’t a creationist text, but a popular science-based analysis. I can’t find the volume on abebooks (which doesn’t appear to list anything before 1981), but I recall quite clearly two of its major objections to Darwinian thinking, so far as the author understood it. One, that the notion of ‘The Survival of the Fittest’ (which was actually coined by Herbert Spencer to describe Darwin’s natural selection) was tautological, and thus meaningless, since what was ‘survival’ but another way of saying that an animal was ‘fit’? Two, that if the energies that contributed to survival took place after the animal had passed on its genetic material to its offspring, there would be no mechanism by which more adaptive traits would endure in the species.

I thought at the time that these points had merit, yet I was not completely discouraged from accepting that natural selection was the most plausible explanation for evolution, even though the exact mechanisms by which it occurred were still somewhat mysterious. I was, however, dismayed by another misconception, namely the way that the Theory of Evolution was frequently misrepresented as something purposeful by even the most knowledgeable of experts. I can recall the great David Attenborough, in Life on Earth, explaining certain phenomena in terms such as: “Thus, in order to survive, the bats had to develop radar.” This notion of purpose in Evolution is obviously nonsensical, and I have occasionally had to write to the Science Editor of the New York Times to point out where their journalists mistakenly ascribe this sense of an objective to adaptive changes. After all, did certain winged birds develop their flightlessness in order to make their life less hazardous? And what was the timescale according to which such adaptive changes worked? How long would it take for various initiatives to fail or succeed before the lack of ‘fitness’ wiped out the species? At the same time, as Jonathan Weiner’s The Beak of the Finch showed, describing the researches of Peter and Rosemary Grant on the Galapagos, small changes in the dimensions of finches’ beaks could rapidly take place in the light of changing climatic conditions and food supply.

Then Richard Dawkins’s The Selfish Gene came along and changed everything, showing that the gene, not the individual organism (as Darwin believed), was the unit of natural selection. I have enjoyed Dawkins’s books since, although I found his first volume of autobiography, An Appetite for Wonder, rather scrappy and chippy. Now I have just finished his sequel, Brief Candle in the Dark. This is a new Dawkins. I think his PR firm must have advised him not to be so offensive and controversial, because he positively oozes congeniality, is nice about nearly everybody, and is not nearly as scathing about religion as he used to be. (There must be a social meme in such superstitions that aids the survival of certain groups, a sad but unavoidable truth.) He also turns out to have almost as many friends as did Denis Healey or Lord Weidenfeld, and appears at times unbearably smug. As a curmudgeon myself, maybe I preferred the traditional Dawkins.

He has some fascinating new insights about the evolutionary process. I was interested to see what he had to say about the hot topic of epigenetics (defined in Chambers as the ‘gradual production and organisation of parts’, but now commonly used for the study of how gene behavior is affected by environmental factors), and how he contrasted it with preformationist thinking (i.e. that, in essence, a homunculus resides inside every human embryo). It has seemed to me lately that some neo-Lamarckians, interested in promoting the notion of the passing on of acquired characteristics, have latched on to the term ‘epigenetics’ to assist their cause. A footnote (p 402) from Dawkins is worth citing in full: “Don’t by the way be confused by the fact that the word ‘epigenetics’ has recently been hijacked as a label for a fashionable and over-hyped idea that changes in gene expression (which of course happen all the time during the course of normal embryonic development, otherwise all cells of the body would be the same) can be passed on to future generations. Such transgenerational effects may occasionally happen and it’s a quite interesting, if rather rare, phenomenon. But it’s a shame that, in the popular press, the word ‘epigenetics’ is becoming misused as though cross-generational transmission was a part of the very definition of epigenetics, rather than a rare and interesting anomaly.” Thank you, Professor. Just what I was looking for.

In one area, however, I wonder whether Dawkins has got it wrong. I recall, at about the same time that I read the book on Darwin, taking in another work that pointed out how quickly scientists make analogies between the human body and whatever the current state of technology is (i.e. a pump in the 17th century, a clock in the 18th, an engine in the 19th, a computer in the 20th). I thought that it might have been Arthur Koestler in The Ghost in the Machine, but I can find no trace of it there, and in those pre-spreadsheet days I did not keep track of every book I read. No matter: I think the point is valid. And Dawkins falls into the same easy habit. On page 382, when discussing the possible source of language, he makes the claim that ‘the human brain must possess something equivalent to recursive subroutines’ (the ability of a routine to call itself and then return to the outer invocation that called it), a feature he says exists in Algol 60, but not in the original IBM Fortran language he used. Such a feature in human genes, which he calls a ‘macro-mutation’, might have come about in a single mutation, and could have been responsible for the ability to create the phenomenon of language syntax. In reducing a complex organic process to a mechanical one, however, I believe Dawkins makes a category mistake. A computer program is only an artifact of the entity that he is describing, namely the human brain, which is a far more complex phenomenon than the strings of ones and zeroes that comprise a language compiler. His comparison is therefore merely crude reductionism.
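For readers who have never met the idea, a recursive subroutine is simply one that invokes itself and then hands control back to the invocation that called it. Here is a minimal sketch in Python (not Algol 60, and not anything from Dawkins’s book; the bracket notation for embedded clauses is my own invention, purely for illustration) showing how a single self-calling routine copes with arbitrarily deep embedding of the kind he has in mind for syntax:

```python
# A minimal sketch of a recursive subroutine, in Python rather than Algol 60.
# The bracketed 'sentence' convention is invented for illustration only:
# '[' opens an embedded clause and ']' closes it.

def max_depth(sentence: str) -> int:
    """Return the deepest level of clause embedding in the bracketed sentence."""
    def walk(chars: list, depth: int) -> int:
        deepest = depth
        while chars:
            c = chars.pop(0)
            if c == '[':
                deepest = max(deepest, walk(chars, depth + 1))  # the routine calls itself...
            elif c == ']':
                return deepest                                  # ...and returns to its caller
        return deepest
    return walk(list(sentence), 0)

print(max_depth("the rat [the cat [the dog chased] bit] squeaked"))  # prints 2
```

The point to notice is that nothing in the routine limits the depth of embedding: the same few lines handle one nested clause or twenty.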

But then Dawkins compounds his error. He goes on to write: “Computer languages either allow recursion or they don’t. There’s no such thing as half-recursion. It’s an all or nothing software trick. And once that trick has been implemented, hierarchically embedded syntax immediately becomes possible and capable of generating indefinitely extended sentences.” First of all, if it is a design feature, it is not a trick. The trick – if there were one – would be an inherent flaw in the software whereby recursion did not work properly all the time, whether through faulty implementation, or through a deliberate clandestine approach that made aberrant decisions based on some external circumstance or internal control data. After all, we all now know about the Volkswagen emissions-control software, which gave false readings when the engine was being tested under laboratory conditions. Similarly, the implementation of a compiler that claimed to allow recursion could disable the feature, or cause it not to work properly, depending on, for instance, the date or time of day, the machine environment, or the particular iteration or count of the software execution.

He thus fails to distinguish between the design statement for a compiler that allows recursion, and the instantiation of that design in code. Moreover, no software is a perfect implementation of its design, which causes the analogy inevitably to stumble. And by hinting at the notion of design in computer languages (what he calls the ‘trick’), Dawkins inadvertently undermines his analogy, since that notion of an architect has no role to play in evolutionary development, natural selection being an essentially haphazard process. Too many of his metaphors (for example, the arms-race, p 340; or ‘if we think of natural selection as a sculptor’, p 359) contain this notion of design at work, and thus weaken his whole argument, since the congenial atheist would assuredly deny the role of any ‘Designer’ in the process of language evolution. While many of the mechanisms by which genetic change occurs are still mysterious, that does not mean they are mystical. Following up on this theme, Dawkins later goes on to praise Chomsky’s idea of a language-learning apparatus genetically implanted in the brain – which also strikes me as a bogus concept, since so many languages have implementations of syntax that are utterly antithetical and incompatible with other schemes. This is the weakest part of Dawkins’s theorizing.

Still, it was all a stimulating and enjoyable read, if you can put up with Dawkins continually reminding you how clever and successful he has been.

P.S. The New York Times informed me, on November 25, that the Saeed Book Bank in Islamabad, Pakistan, sells a thousand copies of Dawkins’s atheist treatise ‘The God Delusion’ each year. Not many people know that.

New Commonplace entries appear here. (November 30, 2015)

4 Comments

Filed under Personal, Science, Technology

‘All The News That’s Not Fit To Archive’

We relational database people are well-organized, methodical. We like analysis and business rules, strong notions of identity, the use of sets and non-significant keys, normalized designs and value-based links, precise versioning and time-stamps, and careful promotion of systems into production, with secure fall-back procedures. All that is tech-talk, but it means something in the real world. (One of the first articles I had published, back in 1980, in Datamation, was titled ‘The Importance of Good Relations’, which showed the link between solid database design and flexible business practices.)
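For readers who have never had to care about such things, here is a tiny sketch of what some of that jargon means in practice: non-significant (surrogate) keys, a value-based link expressed as a foreign key, and a time-stamp on every row. The schema is invented purely for this illustration (it is not from the Datamation article), and uses SQLite through Python:

```python
# An illustrative sketch only: surrogate keys with no business meaning,
# a value-based link via a foreign key, and a time-stamp on every row.
# The schema is invented for this example, using SQLite from Python.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,            -- non-significant (surrogate) key
    name        TEXT NOT NULL,
    created_at  TEXT DEFAULT CURRENT_TIMESTAMP  -- when the row was created
);
CREATE TABLE customer_order (
    order_id    INTEGER PRIMARY KEY,            -- another surrogate key
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),  -- value-based link
    placed_at   TEXT DEFAULT CURRENT_TIMESTAMP
);
""")
conn.execute("INSERT INTO customer (name) VALUES (?)", ("Acme Corp",))
conn.execute("INSERT INTO customer_order (customer_id) VALUES (?)", (1,))
for row in conn.execute("""
    SELECT c.name, o.order_id, o.placed_at
    FROM customer_order o JOIN customer c ON c.customer_id = o.customer_id
"""):
    print(row)
```

Nothing exotic: the point is simply that the keys carry no business meaning, the link between the tables is made by value, and every row records when it came into being.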

Yet the Web has changed all this. When I first developed my website, under Microsoft’s FrontPage, there was some semblance of a test environment and a production environment. I would develop the site on my computer, and when I was ready, and had made sure all the links were defined, and pointed to real pages, I would upload the whole kit and caboodle to the host site, where the new system would replace the old, giving me the option of uploading only the pages that had changed (but admittedly with no easy fall-back to the previous version). No more. I now use something called WordPress, which I invoke on a remote server. It allows me to compose and save drafts of individual pages, but it is otherwise tightly integrated with the production system. If I promote a new page, it goes live immediately, and if I change it again ten seconds later, the page is immediately replaced, with the previous one lost for ever. (Unless it has found its way to some entity called the Wayback Machine, which is described in a fascinating article by Jill Lepore in the New Yorker of January 26, 2015, titled ‘The Cobweb: Can the Internet be archived?’)

I mention all this in connection with my last plaint, from the January blog, about the New York Times and its practice of making changes to the electronic versions of articles after they have appeared in the printed version (or after the late printed version, since that happens too: we in North Carolina get an earlier edition than the people up in New York, for example). The reason this concerns me is primarily one of research integrity, since there is no longer a ‘paper of record’ on which historians can rely. I made this point in an email to the Public Editor, whose office eventually acknowledged my inquiry, promised to look into it, but then lapsed into silence. So, after a couple of weeks, I checked out the paper’s Statement of Standards and Ethics, and wrote to the Vice-President of Corporate Communications. The essence of my message ran as follows:

“For there is a vital question to be answered: ‘What is the paper of record?’ Your slogan on the first page of the printed edition is still ‘All The News That’s Fit To Print’, but apparently some of that news is Not Fit To Archive. What happens when historians attempt to use the paper for research purposes? Do they have to keep separate clippings files, since the electronic version is unreliable, and has been purified in some way for later consumption? Is there an active policy under way here that should affect your Ethics statement? How are decisions made to ‘improve’ the content of articles that have already appeared in the printed edition? Why are these not considered ‘Corrections’ that would normally be posted in the relevant section? How often does this happen?”

I received a prompt response, but it was all very dismissive and casual:

“The change you noticed was simply the result of normal editing, which takes place constantly for news stories, both between print editions and for successive online versions. In this case, additional information (including crowd estimates) was added to the story between the early print edition and the final print edition, which meant something had to be cut for the story to fit in the same space. In most cases, the final print version is the one that remains permanently on nytimes.com, though in some cases a story continues to be updated or revised online even after the final print edition.”

So I countered as follows:

“But I must state that I think that you (and I am not sure who ‘you’ are in this case) are being far too casual about this policy, simply treating the process as ‘normal editing’. Is there an audit trail? Do you keep all versions? What changes are allowed to be made after the final print version? Why cannot the on-line version (which has no size constraints) include all the text? Is there any period of limitation after which no further amendments can be made? How do you plan to explain this policy to readers, whose ‘trust’ you say you value so much?

I am sure you must be aware of the current debate that is being carried on in the world of academic research, where annotations to URLs in serious articles often turn out to be dead links instead of reliable sources. A Times ‘page’ no longer has a unique and durable identity, which I believe is an important issue.

I look forward to some deeper explanation of this policy in the newspaper.”

Well, maybe I should get out more. As Sylvia would suggest to me: “You clearly need something better to do.” But I maintain that it is an important problem, not just concerning journalistic integrity, getting the story right the first time, and not correcting quotations that the speaker wanted to withdraw (which we are told goes on). It is more to do with what is known as ‘content drift’ and ‘reference rot’. As Jill Lepore’s article states: “. . . a 2013 survey of law- and policy-related publications found that, at the end of six years, nearly fifty per cent of the URLs cited in those publications no longer worked. According to a 2014 study conducted at Harvard Law School, ‘more than 70% of the URLs within the Harvard Law Review and other journals, and 50% of the URLs within United States Supreme Court opinions, do not link to the original cited information.’” A more subtle problem is that the links may work, but the content may have changed: it may have been edited, corrected, improved, revised, or sanitised. For researchers like me, this can be very annoying, as books these days frequently cite URLs rather than printed sources in their references, and when those pages no longer exist one feels cheated; even when they do, one may wonder whether they have been modified. The academic process has been debased. If one has text in the New York Times that is no longer in the archive, does it still exist? Is it still valid? Do I really have to maintain my clippings files, as opposed to an index of URLs? (To make her point, the Times Vice-President had to send me a scan of the two printed versions of the relevant page in question.)
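As a small illustration of what a ‘reference rot’ audit looks like in practice, here is a sketch of how a researcher might check whether a list of cited URLs still resolves. The URLs below are placeholders, not those from the studies quoted, and a link that still answers says nothing, of course, about whether its content has drifted:

```python
# A sketch of a 'reference rot' check: does each cited URL still answer?
# The URLs are placeholders. A link that resolves may still have drifted
# in content, which this simple check cannot detect.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

cited_urls = [
    "https://www.nytimes.com/",
    "https://example.com/a-page-that-may-have-moved",
]

for url in cited_urls:
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "reference-rot-check/0.1"})
        with urlopen(req, timeout=10) as resp:
            print(f"OK   {resp.status}  {url}")
    except HTTPError as err:
        print(f"DEAD {err.code}  {url}")            # the server answers, but with an error
    except URLError as err:
        print(f"GONE       {url}  ({err.reason})")  # no answer at all
```

Detecting content drift is a harder problem altogether, which is precisely why archives such as the Wayback Machine, or an old-fashioned clippings file, still matter.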

We shall see. I haven’t received a follow-up to my second inquiry yet. Either the Times doesn’t believe it is an issue, or the managers there are having a big debate about the topic, which they don’t currently wish to share. I’ll provide an update if I do hear anything.

The normal set of Commonplace Updates this month. (February 28, 2015)

Leave a Comment

Filed under Media, Personal, Technology