Friday, November 8, 2013

How Many Years will the “Year of SDN” last?


The valuation of nascent SDN companies is enormous given the state of the market.  One would be led to think that the market for SDN solutions is imminent.  Is it?  While the business value proposition, beyond "F' Cisco", has merit, the roll-out of SDN solutions cannot, and will not, occur nearly as rapidly as those with a vested interest would lead you to believe.

The primary customers for SDN solutions, service providers (SPs) and large enterprises, are by nature risk averse.  SPs have huge geographically dispersed networks, with investors and bureaucratic regulators breathing down their necks.  Enterprises worry about, among other issues, earnings per share and business continuity.

Given this environment, how can SPs and enterprises roll out SDN rapidly?  Their choices are "Rip & Replace" and "Cap & Grow".  Is the SDN value proposition so great as to justify the former?  I think not.  The question then is how fast they can cap existing investments and grow the new SDN solution.

Questions to consider include: how many Class 5 switches have been scrapped?  How long did IMS take to be fully deployed?  As I've stated in a previous article, SDN is not so magical that it can violate innovation adoption conventions.  We know the typical SP sales cycle: lab evaluation, lab trial, field trial, market trial, regional deployments.  Each of these can take 12 to 24 months.
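The stage arithmetic above can be sketched quickly (the stage names come from this post; the 12-to-24-month figures are only the ballpark quoted above, not measured data):

```python
# Rough cumulative-timeline sketch for the SP sales cycle described above.
# Stage durations are this post's 12-24 month ballpark, not measured data.
stages = ["lab evaluation", "lab trial", "field trial",
          "market trial", "regional deployment"]

low_months = 12 * len(stages)    # every stage at the fast end
high_months = 24 * len(stages)   # every stage at the slow end

print(f"best case:  {low_months} months (~{low_months // 12} years)")
print(f"worst case: {high_months} months (~{high_months // 12} years)")
```

Even at the optimistic end, capping and growing works out to roughly half a decade, which is the point of the question in the title.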

Let us not forget organizational issues as well.   Who’s in the lead for SDN deployment, IT or network operations?  Whose budget will pay for and support the SDN system? 

SDN is not a simple transition.  It's not replacing one router with a new-generation router and reconnecting the cables.  Thus, the question:  How many years will the "Year of SDN" last?

Notes:


      To develop winning strategies contact me at gwhelan@greywale.com

Tuesday, November 5, 2013

Greywale Management Releases First Service Provider Energy Strategy Taxonomy


Service Provider Energy Strategy

The energy consumption of telecommunication networks is emerging as a primary concern among network operators.  The largest U.S. carriers each spend over $1 billion per year on energy.  One calculation shows that a savings of just 3% would translate into $0.01 per share in net earnings.  With this in mind, energy strategy has reached the board room!

Given the scope, variability and diversity of these networks, Greywale Management proposes the Greywale Service Provider Energy Strategy Taxonomy® to drive future discussions, research and investments and to prevent random acts of green.  Without a clear strategy map, the industry risks high levels of ambiguity and redundancy in these efforts and delays in implementing much-needed energy management techniques.
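The per-share arithmetic above can be checked on the back of an envelope.  The $1 billion energy spend comes from this post; the roughly 3 billion shares outstanding is my illustrative assumption, since the post does not name a carrier or a share count:

```python
# Back-of-the-envelope check of the savings-per-share claim above.
# Both inputs are illustrative assumptions, not any carrier's financials.
annual_energy_spend = 1_000_000_000   # assumed: $1B per year on energy
shares_outstanding = 3_000_000_000    # assumed: ~3B shares outstanding

savings = 0.03 * annual_energy_spend          # a 3% energy reduction
eps_impact = savings / shares_outstanding     # dollars per share

print(f"savings: ${savings:,.0f}, EPS impact: ${eps_impact:.2f}/share")
```

Under those assumptions, a 3% cut to a $1 billion energy bill is $30 million, or about a penny per share.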




Equally important, it will prevent "random acts of green".  Good "green" ideas are everywhere.  Each one may even have value.  Yet, without an overriding energy strategy driven by the taxonomy, service providers will not maximize their investment and business potential.  Spending scarce corporate resources, finances and management attention this way may produce an initial euphoria but will lead to long-term disillusionment.  Moreover, the taxonomy will ensure that these resources and efforts are spent on the right long-term solution, one that also addresses the energy savings the business needs today.

http://www.prlog.org/12236478-greywale-management-announces-the-first-service-provider-energy-strategy-taxonomy.html

To download the Taxonomy go to www.greywale.com

Wednesday, September 18, 2013

A Telco Energy Strategy Should Demand Zero Impact on Services

As energy strategies reach the boardroom, service provider management should insist on "zero impact" on services.  The stakes are too high in the competitive zero-sum game they participate in.  Customer satisfaction, reduced churn and a strong brand are paramount in this environment.  By treating energy as a strategic initiative they will achieve the benefits of lower OPEX, an enhanced brand and more efficient end-to-end operations.  Tactical energy initiatives will not get funded if they have a perceivable adverse effect on consumer and business services.  These adverse effects could be short-lived, as during installation, or long-term if, for example, latency is introduced.  Thus, their energy strategy should demand zero impact on services.

Note the emphasis on "services" instead of "network".  It would be unreasonable to demand zero impact on the network if you are deploying a new architecture or an energy-aware protocol.  Yet, with IP (Internet Protocol), the impact on the network should not cause a perceivable impact on services.

Is zero impact unreasonable, and wouldn't "minimal impact" be a better goal?  The challenge here would be to define what "minimal" means.  Would it mean X video anomalies per 30 minutes?  Why not X+1?  Would it mean Y dropped calls/tower/minute?  Why not Y+1?  Also, who defines X and Y?  Would the CEO, CTO or CMO define them?  Would international standards organizations set them?

 Setting the goal of “Zero Impact” sends a clear message throughout the organization of what is expected.  Terms such as “sustainability” and “green” will have clearer meaning.  Green projects that make people feel good but have no financial justification will fail fast so the real winners can progress.  Therefore, telcos and service providers should demand Zero Impact on services.

Contact: Greg Whelan at gwhelan@greywale.com to discuss.

Wednesday, June 19, 2013

Energy Management: Focus of Nokia Siemens Networks

From Greywale Management Blog (note there is no "h" in whale)

Nokia Siemens Networks (NSN) announced their Technology Vision 2020 recently.  Energy management was one of six major pillars.   The six pillars are:
  1. Support up to 1000 times the capacity
  2. Reduce latency to milliseconds
  3. Teach networks to be self-aware
  4. Flatten total energy consumption
  5. Reinvent telcos for the cloud
  6. Personalize the network experience

The key point regarding energy is illustrated in the following chart.
  


As shown, electricity alone accounts for 15% of total OPEX.  In developing markets this can be as high as 50%, with a high percentage of off-grid sites.  If energy management is ignored, the cost of power will continue to rise with the expected exponential growth in traffic.  The next chart illustrates that while traffic grows exponentially, energy efficiency grows only linearly.  Thus, the amount and cost of energy will rapidly increase.
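The divergence between the two curves can be sketched with toy numbers (the growth rates here are illustrative assumptions, not NSN's actual figures):

```python
# Illustration of the trend described above: traffic compounding
# exponentially while energy efficiency improves only linearly.
# Both growth rates are illustrative assumptions.
traffic = 1.0      # normalized traffic volume, year 0
efficiency = 1.0   # normalized bits-per-joule, year 0

for year in range(1, 6):
    traffic *= 1.5        # assumed 50% annual traffic growth (exponential)
    efficiency += 0.10    # assumed fixed annual efficiency gain (linear)
    energy = traffic / efficiency   # energy needed to carry the traffic
    print(f"year {year}: relative energy use {energy:.2f}x")
```

Even with steady efficiency gains, energy use multiplies within a few years because compounding traffic outruns linear improvement.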



Other key facts that NSN articulated are that the RAN (Radio Access Network) accounts for 80% of energy consumption and that the current installed base stations are 50% less efficient than new ones.

As with any energy management and energy efficiency program, there is no silver bullet or single solution.  However, there are numerous solutions that, taken together, add up to real savings in energy and money.  This area is too large for this short post.  For now, consider four main areas to investigate:
  1. Devices: Components, Moore's Law
  2. Network Architectures
  3. Network Management and Operations
  4. Marketing and Services

By focusing on energy management and energy efficiency, the end results will be meaningful OPEX savings, a reduced carbon footprint and an enhanced brand for sustainability-conscious consumers.

Please contact me if you'd like to discuss this post.  +978 992 2203  gwhelan@greywale.com

Thursday, May 30, 2013

Cleantech: Can't Change the Forces of Physics or the Forces of the Market


"Save the Planet".  Now that's an admirable goal.  What's next?  "Save the solar system"?  In all seriousness, inventions and innovations that reduce energy consumption and CO2 emissions are worthy goals.  But just as the cleantech entrepreneur needs to address the laws of physics, they need to address the laws of the market.

New technology adoption, in any market, must address fundamental forces to succeed.  Cleantech, like every other market, faces the classic S-curve and Gaussian adoption curves.  Both of these models address the fact that customers have already implemented the current generation of technologies and solutions.  They are familiar with them, they know how to manage them and they have paid for them.


The cleantech entrepreneur must develop, and articulate, a solution whose value proposition is so compelling that customers will risk, yes risk, implementing it.  Very few, though there are some, will implement a cleantech solution only to "save the planet".

The entrepreneur must ask three basic questions:
  1. Can my target customer make money with my innovation?
  2. Can my target customer save money with my innovation?
  3. How easily can my target customers implement my innovation?  

If the answer to both question 1 and question 2 is "NO", then perhaps you should go back to the drawing board.  If the answer to either of them is "YES", the answer to question 3 will determine your strategic marketing plans and your target "innovators" and "early adopters", as defined by the Gaussian technology adoption curve made popular by Geoffrey Moore.  The larger the effort to implement your solution, the more compelling the value proposition must be.
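The two adoption models mentioned above can be sketched as follows.  The logistic midpoint and steepness below are arbitrary illustrative parameters; the segment shares are Rogers' standard percentages, popularized by Moore:

```python
import math

# Sketch of the two adoption models referenced above: the cumulative
# logistic S-curve and the segment shares of the Gaussian adoption curve.
# Segment percentages are Rogers' standard figures.
segments = {"innovators": 0.025, "early adopters": 0.135,
            "early majority": 0.34, "late majority": 0.34,
            "laggards": 0.16}

def s_curve(t, midpoint=5.0, steepness=1.0):
    """Cumulative market adoption at time t (simple logistic model)."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

# Early on, only the leftmost segments are in play:
print(f"adoption at t=2: {s_curve(2):.1%}")   # innovator/early-adopter territory
print(f"adoption at t=8: {s_curve(8):.1%}")   # deep into the late majority
```

The point for the entrepreneur: early in the curve only a few percent of the market is even reachable, so the value proposition must be compelling enough for that small group to take the risk.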

Tuesday, May 28, 2013

Untapped Service Provider Real Estate Assets


I saw the following post in Total Telecom (see below).  While it's an interesting real estate play, it misses the real untapped RE asset play: central offices and regional switching centers.

In the U.S., central offices are decades old and located in prime center-city locations across the country.  Built in the days of human operators, today they are made up of huge empty spaces.  I've seen a medium-sized city's CO that was also a regional switching center; of its seven floors with 20-foot ceilings, all but two were completely empty.

So each local exchange carrier has fully depreciated assets in prime city-center locations that are basically empty.  Here we are talking billions of dollars in hidden asset value.  One major issue is the huge number of physical copper pairs and the Main Distribution Frame (MDF).  I've also seen a large city CO with 110,000 copper pairs.  Impressive!

These MDFs and copper pairs are not going anywhere fast.  Imagine if you could: migrate to FTTx (curb, building, home, access box), put the optical CO gear in the basement and backhaul to a regional data center.  Now, after some likely serious environmental clean-up and building refurbishment, you have an ideal location for an office building and a GREAT REIT PLAY.

Even if the MDFs and copper pairs can't be phased out fast enough, you can still collapse the existing equipment into a small percentage of the building and refurb the remainder.  Better check the battery banks while you're at it.

I'd be interested in your comments.

Greg Whelan
gwhelan@greywale.com
+978 992 2203


TOTAL Telecom Article...

Monetising real estate assets could lead telcos down REIT path

By Mary Lennighan, Total Telecom
Friday 24 May 2013

Telecoms operators worldwide seen following in the footsteps of U.S.-based Cincinnati Bell and monetising their data centres to fund investment elsewhere.

Wednesday, May 22, 2013

Vision vs Roadmap: Part II


In part one we defined the difference between the vision (emotional) and the roadmap (logical).  In part two we'll discuss the connection between the two.

Marketing is tasked with creating a long term vision for the product and/or for the company.  This is a valuable function of the marketing department.  This vision is often referred to as the 10,000 foot view.  In some cases, marketing visionaries take this to 100,000 feet.  Here, we refer to this as the “airplane”.


Down on the ground, salespeople with quotas "sell what's on the truck".  For this discussion all products will be referred to as "a box".  The box is on the truck for the sales teams to sell today.  Traditional, or tactical, marketing engages in activities (demand generation, collateral development, et al.) to assist the sales teams' efforts in selling "off the truck".



Engineering develops the next products to put on the truck.  They base their development roadmap on numerous factors such as customer demands, competitive pressures, market windows and available technologies to name a few. 

A major disconnect between marketing and sales, and therefore engineering, is the lack of connection between the vision and the roadmap: the airplane and the truck.  The consequence is that all the effort to market the airplane does nothing for the salesperson on the street selling boxes off the truck.

The VISION
It's important to push the envelope when developing your vision.  Remember, a vision is emotional, and emotions are ethereal, not concrete.  As the vision moves out in time, it's acceptable that it gets blurry or fuzzy in the later years.  A good metaphor is the hurricane map.


Meteorologists know where the eye of the hurricane is at a given moment.  They have an idea of where it will be at points in the future.   The further out in time the less certain they are where it will be.  The same is true for your vision. 

However, if the vision is too far-fetched, or worse, technically infeasible, your credibility vanishes, never to be recaptured.  To guard against this we'll use the word "plausible".  From Webster.com, "plausible" means:

Definition of PLAUSIBLE: superficially fair, reasonable, or valuable but often specious <a plausible pretext>

To ensure the vision is plausible, ask the engineers whether they could develop feature X if they had the resources (time, money and people) and the prioritization to develop it.  In other words, could it be in a release at some date in the future?  If the honest answer is "yes", then it's plausible and belongs in the vision.

By ensuring plausibility and by including engineering in the vision development you will have the credibility with the customer and the collaboration and buy-in with the technical teams. 

In keeping with my previous post on the global phenomenon of Attention Deficit Disorder, I'll end this with some pointers on what to consider when developing your vision:

1. Focus on how your customers, and their customers, will benefit in their businesses and lives.
2. Focus on how the world will look with the benefits of your solution.
3. Create a compelling view of the future.
4. Illustrate how well you understand their business, their customers' business and the concerns, challenges and issues of both.
5. Address CxO-level care-abouts.
6. Ensure it's in a time frame of interest and reasonableness (3-5 years).

In Part III we’ll discuss the different buying decisions of the vision and the roadmap and how to link them together to sell products today and tomorrow.

Sign up to this blog to ensure you get future posts.

Thursday, May 16, 2013

Powerpoint Addiction: A Symptom of Global ADD


A picture is worth a thousand words.  That doesn't mean you need 1000 words per slide!

Many companies are addicted to PowerPoint.  Yes, addicted.  Whether for internal or external audiences, meetings revolve around passing the projector cable back and forth so "speakers" can show their slides.  How many of us have asked, "Can I get a copy of the slides?"  Similarly, how many people will read even a two-page Word document, unless it's a list of bullets?

As stated in a previous post, ADD (Attention Deficit Disorder) is a global phenomenon.  That's not going to change and will likely accelerate.  Hence the need to think in terms of the billboard metaphor, also from a previous post.

The problem is not with PowerPoint itself; it's with the slide creators.  PowerPoint is quite powerful and over time has eliminated the need for Photoshop for many simple functions.  There are many issues affecting the quality of a slide and of the presentation, time being a main one.  To create truly amazing slides takes a lot of time, both to learn the software and to create each individual slide.  A large company I worked for would outsource the actual slide creation.  This could cost upwards of $5,000 per slide!  Yes, per slide.  The slides were truly amazing.  Yet the average .PPT file grew from 5 MB to 20 MB in 5 years.

Another main issue is that the purpose of the presentation is often forgotten.  Storytelling is becoming a lost art.  With the above-noted slides being "so good", people would naturally reuse them.  Their presentation would then be made up of beautiful slides from a number of decks.  The problem was the enormous amount of redundancy from slide to slide.  For example, slide 1 would make points A, B and C.  Slide 2 would make points B, C and D.  Slide 3 would make points C, D and E, and so on.  Since the slides were so good, and so complex, no one wanted to alter them.  The result was long presentations that wandered, making the story line hard to follow and comprehend.

How many have seen the following slide deck:

Slide 1:  Logo, title, name and date
Slide 2: Overview of company and/or presenter
Slide 3: [the dense, everything-at-once punch-line slide]

If you have, you know you immediately experienced shock and awe.  Here the presenter didn't follow the basic rules of storytelling.  There was no lead-in or build-up to the punch line.  How many jokes are funny if you only tell the punch line?  Presenters need to understand that while you have prepared for the meeting, your audience hasn't.  They've come from another meeting or another activity and need to be grounded in your discussion.  If they don't understand or don't know the joke, they certainly won't get the punch line.

The flip side of this problem is when the presenter spends too much time building up to the punch line.  Slide after slide of "market data", "industry trends" and other "look how smart I am" slides will quickly have your audience checking their email.  It's fair to assume that your audience doesn't know the subject as well as you do, but don't assume they're complete morons either.

Two last points for this post.  One, does anyone actually care about the number of slides?  Some slides may only be on the screen for 10 seconds as a segue or for re-grounding.  Some slides may be on the screen for 10 minutes to illustrate (simply, I hope) a complex concept.  It's about telling a story.

Two, how often do you linearly deliver a slide show from start to finish in lecture mode?  Yes, there are times, such as when giving an actual lecture or presenting at a conference.  However, many times your slides are there to stimulate a conversation.  A really good slide, regardless of visual quality, could be one that you leave on the screen for 30 minutes and use as a reference.  Presenters, learn to "zig and zag".  Know your slides well enough that you can jump back and forth to keep the conversation flowing.  Combine that with good meeting management skills and you will rock.

Give me a shout if you'd like to discuss.

Wednesday, May 15, 2013

Solution Marketing -- Ensuring 1+1 > 2 (Part II)


In part one the solution model was introduced.  In part two the basic fundamentals are discussed.  The focus here on "solution" is when a company tries to leverage more than one product to create a sustainable competitive advantage over companies with either one of the products, or both.

For discussion purposes, the model of a product is defined in the figure below.


H(x) is the transfer function of the product.  I(x) and O(x) are the inputs and outputs of the product.  The inputs are acted upon by the transfer function to produce the outputs.  M(x) is the management interface to the product.

It is imperative that you never underestimate the value of M.  Management, or the larger ongoing operation of a product, is a large continuous expense (OPEX).  It's a dominant part of the widely discussed total-cost-of-ownership metric.  Operations are embedded in the organization.  The personnel responsible for management may not even be the people who use the actual product.  Product managers and solution managers must be cognizant of how the customer uses the product, deploys the product and manages the product.


The solution model below is comprised of two products P1 and P2.  

The goal is to create and market a "solution" S1.  The following is an introduction to the process.

FIRST PASS

1. Do not create a solution where S1 < P1 + P2.
   a. In other words, if you are creating a solution, make sure you nail 100% of each product in the first generation, even the table stakes.
   b. Customer expectations and knowledge are based on the entire product details of P1 and P2.

2. Do not insert a P3 between P1 and P2.

3. Ensure M' ≥ M1 + M2, i.e. the solution's management must be at least as capable as P1's and P2's management combined.

SECOND PASS

Now the real challenge, real value and real competitive advantage arise.  Once the transfer functions H1 and H2 are fully understood, the next step is to optimize and reduce their combination: H'(x) < H1(x) + H2(x).  The feature set and functionality are only reduced if it is determined that there are functions that are not required.

For example, if:

H1(x): A + B = C and H2(x): C x D = E [Output 2]

then the fused transfer function is H'(x): (A + B) x D = E

The interim value C does not need to be calculated and acted upon.  This overly simple example illustrates how the combination of two transfer functions can be reduced to add value.  Some higher-tech examples include:

1. Less die space on a silicon chip
2. Faster execution of software functions
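The fusion step above can be sketched in code, using the post's A, B, C, D and E as stand-ins (a toy model, not a real product pipeline):

```python
# A minimal sketch of the transfer-function fusion described above,
# with h1 and h2 standing in for the two products' transfer functions.
def h1(a, b):
    return a + b          # P1: produces the intermediate value C

def h2(c, d):
    return c * d          # P2: consumes C, produces output E

def solution(a, b, d):
    # Fused H'(x): (A + B) x D, with no intermediate C materialized,
    # stored, or passed across an external product interface.
    return (a + b) * d

# The fused solution yields the same output E in one step:
assert solution(2, 3, 4) == h2(h1(2, 3), 4)
```

The saving is not the arithmetic itself but the elimination of the hand-off: nothing between P1 and P2 has to be formatted, transmitted, or managed.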

ROADMAP

1. Since the interface between O1 and I2 is embedded in the solution, it need not comply with standards.
   a. Over time, you can optimize this interface since it's internal to the solution.

2. M' can also evolve to optimize M1 + M2 and to add solution-centric enhancements.


This post begins to articulate how companies can gain a sustainable competitive advantage by creating a real solution.

Contact me if you'd like to learn more.

Greg Whelan
gwhelan@verizon.net
+978 992 2203





Monday, May 13, 2013

Messaging - The Billboard Metaphor


Today, marketing is dominated by social media. If you sell to consumers you must have a Facebook presence. If you market to businesses you must have a LinkedIn presence. Everyone is on Twitter. How does the marketer get their message across in this media world? The old 30-second TV ad is too long these days. No one reads 12-page whitepapers any more, even if you now call them "e-books". No one is fooled. With ADD (attention deficit disorder) being a global phenomenon, what's the correct way to think about your messages?

The answer is the BILLBOARD.

“I would have written you a shorter letter if I had more time”

This quote, in various versions, has been attributed to Twain, Cicero and Voltaire.  It doesn't matter who actually said it first.  What matters is that it defines the role of marketeers today.

To get your message across to overwhelmed consumers requires marketeers to sharpen their pencils and spend the time to write the shortest letter possible.  These “letters”, whether documents, presentations or videos, need to get the key message(s) across quickly.  The apt metaphor is the billboard.   


Billboard marketeers need to get their message across to drivers in less than a second.  Think of driving down the highway at 65 mph and glancing at a billboard.  You don't have time to read lines of text.  You have time to grasp an image and a key phrase.  That's why billboard marketing is the ultimate form of messaging.  Creators must boil down the value proposition over and over again until it is terse to the nth degree.  Figure 1 shows some examples that illustrate this point.


Figure 1
Examples of Effective Billboards
Source: Google Images

These examples are effective since they leave the viewer with a clear message of the value of the product.  Looking at the first one, we see a tasty hot dog and are asked whether we love dogs.  If we do, we leave with the action to think Pink's.  The second example hits the viewer with a simple message: natural, a pineapple and Skyy Vodka.


Call  978 992 2203 to discuss how billboard marketing can work for you

Thursday, May 2, 2013

Energy Management in IP and Mobile Networks


Energy management in IP and mobile networks is a nascent marketplace.  Why focus on this area?  First, the Internet is expected to consume 4% of the world's electricity, up from 2%.  Reductions here can have tremendous economic and environmental benefits.  Currently, IP traffic growth is exponentially outpacing energy efficiency in both fixed and mobile networks.  A small percentage of energy savings translates into billions of dollars in energy costs.  Savings here, as with any OPEX, result in cash delivered to the bottom line.


Second, sustainability for service providers and large companies is moving beyond saving money to becoming a strategic competitive advantage.  Consumers demand "green", and sustainability enhances the brand.  With a "bit" being a "bit" and a "packet" being a "packet", brand image is critical to capturing and retaining customers.  Additionally, the global financial markets now link sustainability to management sophistication.

This is an emerging area which will be covered in depth in the new sister publication, Greywale Research.  Contact me for more information: gwhelan@verizon.net

Monday, April 22, 2013

Mobile Banner Ads: The Ultimate Billboard!


I use the "billboard" metaphor to address strategic messaging in an ADD-driven world (see previous post, Social Media Marketing - The Billboard Metaphor: November 19, 2012).  The concept is to understand your value proposition and your customers so well that you can effectively get your message across in less than a second.  Mobile banner ads pose an additional challenge.  Traditional billboards you glance at as you drive by; they are there and you accept them for what they are.

Mobile ads cause you angst, and you want them to go away as fast as they can.  This creates an extreme challenge for the marketer.  Your message must be honed to a compelling image and three compelling words to get the annoyed viewer to first look at it and then respond.

Mobile Banner Ads are truly a case where you need to spend the time to write a “short letter”…. A wicked short letter!

Wednesday, April 17, 2013

Semiconductor Companies can Transform with the End of EOL


While it makes economic sense to end-of-life devices, it's not without direct and indirect costs to the semiconductor manufacturer or to their customers.  The solution is to end EOL and move to a Mid-Life Transition.

Large semiconductor companies (semicos) are challenging businesses in many dimensions.  In many ways they lead the global innovation engine.  To the marketing strategist, the cool part of the industry is understanding the market dynamics and trends to determine what device will be needed in two years (the median time to fully bake a complex device).  To others, the cool part is managing the entire complex supply chain.  From wafers to packaging to test to warehousing to shipping, and all the steps in between, it is a daunting task.  Multiply this by the number of devices a manufacturer sells, perhaps many thousands, and it's clear this is truly a daunting endeavor, one I personally am glad there are armies of professionals to deal with so I can focus on what device is needed in two years.

Given this daunting endeavor, it must be challenging to determine when to end-of-life (EOL) a specific part.  Not everything has a life cycle of less than a year like cell phones.  The semico's goal is to maximize the ROI of each device.  Yet will circuit designers hesitate to specify a part if they think it's been around for a while and may be near its EOL?  Redesigning a printed circuit board is also a costly process.

Yet semicos have finite capacity along the entire supply chain.  Given the nature of the manufacturing process, making small volumes is not an option.  Publicly traded companies also have to concern themselves with the inventory line on their balance sheets.  Historically, semicos move parts to the EOL phase of the life cycle.  This "final" phase of a device's life may take years to complete.  The supply chain, as noted, is a well-honed global process.  EOL devices create an exception: no longer can they be part of this traditional process, so a special supply chain process is created for them.  Exceptions to any process create cost.  Plus, catalogs and marketing materials, both paper and online, need to be amended or removed, sales teams need to be trained, and, most importantly, customers must be notified that the device is being phased out.  What's old to you may be new to them.

In the simplest case, where there's a pin-for-pin replacement, the customer is still impacted.  Their supply chain needs to be modified as well.  Engineering documents must be changed, bills of materials updated, and manufacturing schedules altered.  It's more costly when the "new" device isn't a pin-for-pin replacement.

The solution to this problem is to "End EOL" and to change the paradigm to one of a Mid-Life Transition.  Companies such as Rochester Electronics (Newburyport, MA) can be integrated early in the process to become strategic extensions of the supply chain.  These companies manage the hand-off to provide a seamless transition.

Any transition will be more successful if there is continuity during the process.  To assure this during the Mid-Life Transition, semicos should make Rochester, et al., an integral part of their supply chain.  No longer is EOL an abrupt event; rather, it becomes a seamless transition.  No longer will EOL be a troublesome exception to the well-honed supply chain.  Rather, it will be another well-honed path.

The end of EOL and the move to a Mid-Life Transition benefit everyone in the industry.  No longer are disruptive and costly events incurred by the semico or by their customers.  No longer are your devices and intellectual property traded around the world in the proverbial dark alleys like illicit substances.  The end of EOL is a transformational event for the industry.

Tuesday, March 5, 2013

First to Market Paradox - Part I


Business schools teach that being first to market is a desirable strategy.  The opportunity to be first to market occurs when there is a disruptive technology or a new emerging market requirement.  When either occurs, incumbent manufacturers are typically slow to address the new opportunities.  The reasons are numerous and include the management of day-to-day operations, an investment in the previous generation of technologies and the initially small market size.  New entrants to the emerging market have an initial advantage for the mirror-image reasons: they are not dealing with day-to-day issues, they do not have an investment in the current generation of technologies, and the initial market size can justify the investment to address it.  However, the high-tech landscape is littered with companies that adopted this strategy and failed.  This article will highlight the primary, and in some cases counter-intuitive, reasons why this occurs.

This article will not address the situation where a company creates a new market with no comparable options.  Also, I am using the term "product" to describe a unit that an end user, enterprise or consumer, acquires.  It could be a hardware product such as a networking or other IT device, or a software solution such as a CRM system or a source-code management system.  While applicable in some cases to consumer products, the primary focus is on business-to-business (B2B) marketing.

The first-to-market strategy is appealing on the surface for a range of solid reasons.  First, new entrants can react quickly to new and emerging market requirements.  They can start with the proverbial blank piece of paper and develop a product that addresses these newer requirements.  These products can also be developed with the latest generation of merchant, or off-the-shelf, silicon and/or the latest software innovations.  This should provide a cost and/or performance advantage.

In the B2B arena, whether service provider or enterprise, sales cycles can range from 6 to 36 months depending on the complexity of the solution.  In each case, the existing production infrastructure is producing revenue either directly, as in the case of the SP, or indirectly, as in the case of the enterprise.  A typical deployment cycle starts with a lab test, followed by a field test or pilot program, then a limited rollout, and finally full-scale adoption and deployment.

Being first to the lab trials gives a vendor advantages.  First, they establish personal relationships with the key personnel involved.  Second, they learn firsthand the desired requirements for both their product and the larger system, and they can influence both.  Third, they are well positioned to be the first vendor in the next phase of testing.  That said, being first does not mean being the finalist.

The risks of being first to market are significant.  First, incumbent vendors already have relationships throughout the large customer organization.  They are in the system, meaning they have a history of delivering products, supporting products and getting paid for products.  Second, through these relationships they learn of the new product's desired requirements at the same time as the new entrant.  Since they do not have a first-generation product, they can begin developing theirs based on the real requirements and on what the customer has learned about the first-generation product, both the good and the bad.

In Part II, we’ll discuss the adverse effects the first to market company faces that leads to failure.


Sunday, February 17, 2013

Acquisitions: Don’t Forget the Vision and the Messaging!


Acquisitions in the tech industries continue unabated.  Some companies are frequent buyers and others occasional buyers.  The reasons for buying are well understood, and most deals make sense on the surface.  The integration of the acquired company is not without risk and often comes down to retaining key personnel.  After the big issues are resolved, the financial details, the IT systems combined and the organizations merged, life goes on.  Two important areas that are often overlooked are the integration of the vision and the strategic messaging.



The vision and messaging are indeed strategic. They differentiate you from your competition, they help your sales teams, they strengthen your brand, and they are the seeds for dozens of tactical activities both internally and externally.  If this is true, which it is, why aren't they front and center during the entire acquisition process?

Questions such as these need to be addressed throughout the entire process:

1. What’s the new strategic vision and message of the combined entity?
2. Why are the buyer’s vision and messaging more compelling?
3. How does the buyer's new vision drive an enhanced roadmap?

By addressing these key issues early and throughout the acquisition process, the entire process is streamlined.  Everyone involved, including employees, customers, investors and the media, will see that the acquisition “makes sense”.  When this occurs, resistant and negative forces will abate and the odds of a long-term successful acquisition will increase.

Please contact me if you’d like assistance in this area.  

Saturday, February 16, 2013

What percentage of the smart phone potential have we seen?


The smart phone is extremely powerful.  You can read mail, surf the web, check Facebook and Twitter, play games, get directions from wherever you are, do thousands of other things, and even make phone calls from almost anywhere!  And most importantly, you can store it in your shirt pocket.  Sorry, tablets.


That said, I’m a firm believer that we have only seen the proverbial tip of the iceberg with the potential of these devices.  I’m hesitant to put a percentage, even 1%, on this potential since I’m not sure where it ends.

As we saw with the web, bandwidth drives innovation.  As soon as broadband replaced the dial-up modem, which was capped by the 64 Kbps digital voice channel (analog modems topped out at 56 Kbps), this innovation engine exploded.  With every increment of bandwidth, innovations came to market that were not possible at the previous limit.  That’s why it’s important to ensure service providers never become a dumb pipe.

Wireless has its own bandwidth challenge, bounded by Shannon’s Law.  Yet we continue to see advances here as well.  We’ve seen 2G, 3G and now 4G/LTE, and 5G is in the works.  Smaller cells (femtocells and picocells) and massive increases in Wi-Fi offload will all continue to raise the effective bandwidth available to each smart phone.
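For readers who want that ceiling made explicit, Shannon’s Law (the Shannon–Hartley theorem) bounds the error-free bit rate of a channel of bandwidth B hertz at signal-to-noise ratio S/N:

```latex
C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits per second}
```

Smaller cells and Wi-Fi offload attack both terms at once: each additional cell reuses the same spectrum B, and shorter radio distances raise the received S/N, so aggregate capacity keeps growing even though the per-channel law is fixed.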

We can only imagine what we and future generations will use our smart phones for.

Thursday, February 14, 2013

High-end Only STB Vendor? Is this Possible?


It is understandable that a dominant vendor such as Cisco would be skittish about slugging it out in the low end of the set-top box (STB) market.  The gross margins on these devices are much lower than on a CRS-1 or ASR-9K.  (OK, big understatement.)  Yet, is a high-end-only strategy sustainable?  Let’s look at a historical example that argues it is not.

In the heyday of the “IBM PC” market, as it was known, there were dozens of manufacturers of PC “compatibles”.  At the same time, the Win-Tel (Windows and x86 Intel) franchise was moving rapidly up-market, eating away at the mini-computer and UNIX server markets.  The marketing messages from the mini and server vendors were as expected: lower performance, not purpose-built, not robust, etc.

Yet, large system vendors such as DEC, AT&T, Unisys and Wang felt the need to offer their own manufactured PCs and Win-Tel servers to be able to offer the total system solution.  There are numerous interesting lessons from these dynamics, but let’s focus only on the PC for now. 

At this time Dell and Compaq were thriving selling the full range of PCs and were starting to compete in the server market, where servers were priced around $80-100K.  DEC and the others began to see the low-margin PCs, especially in the consumer market, as a nuisance.  DEC publicly stated that it was abandoning the “low end” PC market to focus only on the high-end office PC and server markets.  Sound familiar, Cisco?  What happened was that the Win-Tel architecture came to dominate the server market and cut harshly into the UNIX server market.  Compaq and Dell were the winners.  Why?

If you compared the hardware components of a low-end consumer PC and a Win-Tel server, they shared many of the same parts: the same memory chips, the same processor family, the same disk-drive family, etc.  Compaq participated in the high-volume consumer PC market, with volumes measured in the tens of millions.  In this market gross margins were 5% on a good day.  However, Compaq was buying these common components from manufacturers in 10-million-unit quantities.

DEC, on the other hand, was selling these $100K servers in the tens of thousands and “high-end office” PCs in the low hundreds of thousands.  Therefore, it was buying the common components at 100K-unit pricing.  When Compaq sold a $100K server in 10K volumes, it still had component pricing in the 10-million-unit range.  Thus, by participating in the low-end, high-volume market, Compaq had a tremendous cost advantage in the lower-volume, higher-price market segment.  In the end, Compaq bought DEC, arguably for the professional services business.
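To make the volume-pricing arithmetic concrete, here is a minimal sketch.  The discount curve and dollar figures are invented for illustration only, not actual DEC or Compaq data; the assumption is simply that each 10x increase in annual purchase volume cuts component pricing by roughly 20%.

```python
import math

def unit_component_cost(base_cost: float, annual_volume: int) -> float:
    """Toy volume-discount curve: every 10x in volume above 100K units
    cuts component pricing by ~20% (assumed, for illustration only)."""
    tiers = max(0.0, math.log10(annual_volume / 100_000))
    return base_cost * (0.8 ** tiers)

BASE = 1_000.0  # assumed component bill for one server at 100K-unit pricing

# DEC-like vendor: buys at ~100K-unit volumes (servers and office PCs only).
dec_like = unit_component_cost(BASE, 100_000)

# Compaq-like vendor: consumer PC volumes pull all purchasing up to ~10M units.
compaq_like = unit_component_cost(BASE, 10_000_000)

print(f"100K-unit buyer's component bill per server: ${dec_like:,.0f}")
print(f"10M-unit buyer's component bill per server:  ${compaq_like:,.0f}")
```

Under these assumed numbers the high-volume buyer pays $640 against the low-volume buyer’s $1,000 for the very same parts, a 36% cost advantage carried into the high-margin server segment, which is the crux of the argument above.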

Will history repeat itself in the STB market?  The “low-end” and “high-end” STBs share many common components.  The argument that STBs are commodities at the hardware level makes sense in the short term.  The software equivalent of Moore's Law, i.e., software gets better with every release, will mitigate any software advantage over time.  Regardless of any perceived short-term software advantage, in the end, since STBs represent roughly 50% of an SP’s CapEx, low price will always win.  Add in the move toward virtualizing the STB, and a high-end-only STB strategy is doomed to fail.


To discuss this please contact me at gwhelan@greywale.com

For a list of other articles and blog post please see...    www.greywale.com