Laptops: why they all stay the same

Apart from the processor war of attrition between Intel and AMD, it seems every OEM has abandoned the laptop market to jump into the tablet war zone.

Every iPad 2 wannabe touts some very decent specs: as of February 2011, a dual-core Tegra 2 or Qualcomm Snapdragon with 1GB of RAM is standard.

But for the laptops? Save for stuffing new chips inside and changing the color of the casing, nothing new is happening. Dell tried the Adamo super-thin and ultra-sexy line some months back, but that offering has been suspended. Lenovo has some neat tricks of its own, but nothing mainstream enough to change the direction of the game for everyone.

How come the same 1.3-megapixel camera has been standard on laptops since Father Christmas started riding reindeer? Why can't we have HD cameras as standard? What happened to keyboards that glow slightly in the dark? Why can't we have dual cameras on laptops too, front and back facing, with the capability to record HD? I don't see them as something to carry around, but with video conferencing over wifi, it looks like something nice to have. Multi-touch pads are also the way to go. Too bad Windows 7 can't use that trick for now.

But with everyone fixated on tablets like ants doing laps around sugar, I might have to wait a wee bit longer.

CDMA, going, going, gone?

The last few days have seen some rumblings in the telecoms sector in Nigeria. News (234Next | Thisday | Business Day) had it that Telkom is thinking of pulling out of Multilinks, a CDMA operator, or has even left already (depending on whom you ask), because its investment has gone down the drain.

That wasn't news to me, because they were playing a loser's game right from the word go. Defensive or "me too" investment is always a tragedy for any organization, because it shows an obvious lack of strategic direction.

But that is for Multilinks.

Starcomms, the largest CDMA operator, also pawned about 80% of its base stations to Swap Tech to offset the giganormous debt on its balance sheet.

On their own, these two stories don't mean much (businesses run into trouble everywhere in the world), but then everyone started saying the CDMA business is bad in Nigeria. CDMA operators in Nigeria are having a tough time, but the technology is not to blame. In North America, some of the largest operators use CDMA and are pretty successful. That GSM is the dominant global mobile standard says more about the strategy behind its introduction than about the technology itself.

From my own experience, there are some major faults with the CDMA operators in Nigeria that are hastily dragging them to their deaths.

  • One, they lack strategic direction. Considering that the CDMA operators came before the GSM telcos, is their current debacle not surprising? Prior to the advent of GSM, the CDMA operators preferred to sell to the "elites"; few had more than 30,000 lines on their switches. It was the age of mobile for the "big man". Then GSM came, the floodgate was opened, and CDMA has been playing catch-up ever since.
  • Two, the CDMA operators restrict customers to the phones they provide, and these are usually dumb, ugly phones sold at exorbitant prices. So nobody wants to use them, and those who do don't use them as their main contact phones.
  • Three, phones aside, the CDMA operators lacked features and service offerings. No BlackBerry, and you could hardly browse on your phone. Then again, the phones were so basic that even if you could, the phone wouldn't help matters anyway. Check out their websites and see the very lame product and service offerings.
  • Four, when the GSM operators pulled right past them, they went into overdrive and started competing on cost. That was the last nail. After that, everything just spiraled out of control.

Telecom is a volume business, and unless you have a pretty good idea of how to draw in the crowd, you can't compete. Visafone was launched as a collection of different networks and, with innovative marketing, quickly leapfrogged to the fore.

Vodacom (then 50% owned by Telkom) had a chance to play in the GSM space in Nigeria, but curiously, its board played it away and the opportunity passed. To show how crazy things can get, eight years later Telkom paid $280M to acquire Multilinks in Nigeria. Of course, the owners were ready to bail out of a losing market by then and were more than happy to sell. And despite paying top dollar, the turnaround strategy was, at best, facing the wrong direction. Today, the rest is history.

Consolidation of the telecoms market is a reality in every country. Maybe this is our time to shed the losers.

Bummer by Midsummer?

Microsoft's tie-up with Nokia looks like a very bad deal. Nothing good has come from Ballmer and Stephen Elop, and this may not be different. I predict that Nokia will be the loser in this.

How does it feel, as the CEO of a company, when another company associates with you and everyone screams "losers"? That shouldn't be too far from the disgust Steve Ballmer must have felt skipping back to Seattle this evening.

Today, Nokia finally let the cat out of the bag, announcing the hook-up with Redmond, but the market reacted negatively to the news, stripping 10% off the share price before you could wink twice. Apart from someone in my business class, everyone feels Microsoft has just supplied the nails to pin Nokia firmly into its coffin.

No doubt Nokia has lost its mojo, and this can't be better explained than by the "burning platform" parable from Stephen Elop.

I wouldn't mind adding my outcry to this, but maybe a little restraint makes sense at this time. Both guys ain't idiots, but I'm not saying they are smart either. Time will tell: in the mobile world, one year is like 1,000 years. By midsummer, we should know if Ballmer has made Nokia a bummer.

Database in the cloud

In January this year (2010), I blogged about how nice it would be to have a database in the cloud. That entry was inspired by what Marc Benioff was doing at Salesforce.com. Apparently, he was miles ahead of my thinking: Salesforce.com has released Database.com.

Database.com is a database running solely in the cloud. There is no hardware to manage and no software to install or configure; just create and run. From the available information, basic access is free: you get 100,000 records and 50,000 transactions a month, and maybe an unlimited number of tables and other objects (they didn't talk about that). From your web applications (PHP and the rest of the gang) you can connect to the database with REST and SOAP, while authenticating with OAuth 2.0 and SAML.
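To make the REST-plus-OAuth idea concrete, here is a small Python sketch of the general shape of such a call: build an HTTP request for a query endpoint and attach an OAuth 2.0 bearer token in the Authorization header. The instance URL, endpoint path, and token here are hypothetical placeholders, not the documented Database.com API.

```python
import urllib.parse
import urllib.request

def build_query_request(instance_url: str, access_token: str, query: str) -> urllib.request.Request:
    """Build an authenticated GET request for a cloud-database query endpoint.

    The /services/data/query path is an assumed example, not the real API.
    """
    # URL-encode the query text and attach it as the q parameter
    url = instance_url + "/services/data/query?" + urllib.parse.urlencode({"q": query})
    # OAuth 2.0 bearer tokens travel in the Authorization header
    return urllib.request.Request(url, headers={
        "Authorization": "Bearer " + access_token,
        "Accept": "application/json",
    })

# Hypothetical instance URL and token (in practice the token comes from an OAuth 2.0 flow)
req = build_query_request(
    "https://db.example.com",
    "my-access-token",
    "SELECT Name FROM Account",
)
```

Nothing is sent over the wire here; passing `req` to `urllib.request.urlopen` would perform the actual call once a real endpoint and token exist.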

However, I have some unanswered questions.

  1. Apart from REST, JPA and SOAP, what happens to JDBC, ODBC, .NET and other connectivity layers? I might want to change my database while keeping my Jurassic Park applications running just the way they are. Anyway, I guess some script kiddie will come along and write another layer to slap on.
  2. Some guys love to write crazy, convoluted queries (I mean pretty bad queries that run forever); is that allowed? You know, one of the issues we faced with early shared internet hosting was your "web" neighbor doing crazy stuff that dragged everyone down. Those were the dark ages.
  3. What flavor of SQL do we write? Oracle has its dialect, same for MySQL, MS SQL (it even differs between versions) and loads of other nutcase GPL forks.
  4. Will I get to connect my Toad (Quest is good!)?
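The dialect question is not hypothetical: even something as basic as "give me the first five rows" is written differently on each engine. A quick illustration (the table and column names are made up):

```python
# The same "first five rows" query, in three vendor dialects.
# MySQL uses LIMIT, SQL Server uses TOP, and classic Oracle
# wraps the ordered query and filters on ROWNUM.
FIRST_FIVE = {
    "mysql":     "SELECT name FROM users ORDER BY name LIMIT 5",
    "sqlserver": "SELECT TOP 5 name FROM users ORDER BY name",
    "oracle":    "SELECT name FROM (SELECT name FROM users ORDER BY name) WHERE ROWNUM <= 5",
}
```

A cloud database would have to pick one dialect (or invent its own), and every migrating application would have to rewrite statements like these.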

If this model becomes successful, it could be the end of the road for the likes of Oracle, DB2 and Microsoft (don't even mention Sybase; they got to the end of the road years ago). Why? Because today, the portion of the total cost of ownership that goes into tuning and maintaining a database often dwarfs the initial cost of the software. And if Google should throw its BigTable database into the fray, with maybe 1,000,000 free rows and other freebies, then you can size up the impact of the impending Armageddon. And mind you, some folks are already successful in the cloud business thingy: Amazon's EC2 is the king of the park, and Google Apps has even signed up the US government. It seems the cloud is here to stay.

OK, maybe I am getting dramatic, but sincerely, can you remember the days when Hotmail and Yahoo used to give just a measly 2MB of email storage and floppy disks were the king of data backup? Think again.

Meanwhile, we have to wait till 2011 to try Database.com out, and considering that it is just some 24 days away, that isn't a long time to wait, is it?

Time to upgrade internet protocols

The world has changed since 9/11, so why should the internet remain the same? Obviously, I'm not talking about the development of internet technologies, which have grown in leaps and bounds, albeit the buzzword Web 2.0 is on its way out. I am talking about the basic internet protocols.

A 328-page report was recently released in the US about the threats posed by rogue nations to the security of the internet. The report said that last year China Telecom broadcast bad BGP (Border Gateway Protocol) routes and routed a large portion of internet traffic through its opaque network for about 18 minutes (US military and government traffic were affected). Also coming to mind is the recent infiltration of Google by Chinese hackers.

After 9/11, the concept of preemptive security took firm root in everyone's mind. There is now a greater sense of security awareness on personal computers, but what has been left behind are the base internet protocols. Starting with basic HTTP, I think there should be a firm deadline for the deprecation of this protocol. Before now, the grouse with HTTPS was its bandwidth utilization, because it uses more than plain-vanilla HTTP, but that is no longer tenable now that broadband is the norm. Google made a move in the right direction by making HTTPS the default for Gmail, but it can do better by making all its services HTTPS-only.
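As a sketch of what deprecating plain HTTP looks like at the server, here is a minimal stdlib-only Python example: a URL rewriter plus a tiny WSGI app that answers every plain-HTTP request with a permanent redirect to its HTTPS equivalent. The host names are illustrative; any real deployment would do this in the web server or load balancer instead.

```python
from urllib.parse import urlsplit, urlunsplit

def upgrade_to_https(url: str) -> str:
    """Rewrite an http:// URL to https://, leaving https URLs untouched."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

def redirect_app(environ, start_response):
    """WSGI app: 301-redirect any plain-HTTP request to the HTTPS equivalent."""
    if environ.get("wsgi.url_scheme") == "http":
        location = "https://" + environ["HTTP_HOST"] + environ.get("PATH_INFO", "/")
        start_response("301 Moved Permanently", [("Location", location)])
        return [b""]
    # Already on HTTPS: serve the request normally
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"secure"]
```

A browser following the 301 never touches the plain-HTTP version again for that request, which is essentially what an HTTPS-only policy amounts to.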

DNS and other routing protocols should be addressed too. A terrorist can do more damage (even physically; the speculation about what Stuxnet could do is extremely ominous!) by waging cyber-attacks than by blowing himself up.