Monday, April 25, 2011

What do you do if you have a consumer electronics product that is universally considered the best in its category? It is a product that is selling exceptionally well and has new and innovative improvements ready to be announced. Most companies, especially in these less-than-robust economic times, would be happy to keep taking the cash from those brisk sales to the bank. If you are Cisco, the giant computer networking company, you discontinue the product line. That’s right: Cisco announced that the Flip video camera will no longer be manufactured.
If you are a regular follower of my column, you know that from the time the Flip was released a few years ago I have been a big cheerleader for the simplicity and utility of this cigarette-pack-sized video camera. After reviewing the statements from the Cisco suits, I remain convinced that for the vast majority of consumers the Flip is still the best small video camera available, albeit only for a few more months.
Several competitors have surfaced in this product category, but all of them seem to miss why the Flip was such a great product. Companies like Kodak, Sony and Toshiba retained the Flip’s small size, but their engineers and designers could not resist piling on features. The Flip essentially has one button; the competitors’ extra controls get in the way of that point-and-shoot simplicity.
Cisco, in announcing the cancellation of the Flip, said the camera was no longer a viable product because mobile phones had added video recording capability. That may be true, but a simple test proved to me that Cisco is misinformed about the practicality of using a phone to capture spur-of-the-moment events.
I have a smart phone with video recording capability. If I want to make a video recording, I have to go through six steps, all of them buried in on-screen menus, before I am actually recording. With the Flip there are two steps: I turn it on and press the red button.
Some would accuse me of being a modern-day Luddite, but I am not a big supporter of the Swiss Army knife approach to all things electronic. Adding features adds complexity and often gets in the way of ease of use. In a car you should not need to look at an on-screen menu to turn on the windshield wipers or the heater. The remote control for your TV should not have more buttons than the space shuttle. Guess I am showing my age.
Sunday, April 17, 2011
When Is a TV Not a TV?
Late last month you may have received information from Time Warner Cable about a new service being offered to area subscribers. The service allows individuals who have an Apple iPad to watch some of the TV programs carried on the Time Warner system. The company, like other cable TV companies around the country, is offering a new app that uses your iPad and home Wi-Fi to connect with your cable service. Once connected, the iPad serves as a portable TV and can be used anywhere in the house without wires or other special connections.
The service was praised by many media pundits who applauded the seamless merger of the computer and the TV. This convergence has been the “holy grail” of proponents of bringing the Internet and traditional TV to a variety of screens large and small, fixed and portable. iPad users too were happy to get a free app that made watching TV more convenient.
Not everyone was so happy. Faster than it took to download the new app, several program suppliers were crying foul. Cable networks like Discovery, Fox and Viacom argued that their programming was licensed to Time Warner for distribution to TV sets, not for streaming to computers. They demanded that Time Warner remove their programming from the list of offerings available for iPad viewing.
Welcome to the new TV landscape, where ultimately the definition of a TV set will be a critical factor in your ability to watch your favorite shows. As you might expect, the brouhaha centers on money. The program distributors are concerned that if viewers watch their programming on any device other than a traditional TV, they will not be counted in the Nielsen surveys and advertising revenue may suffer. Nielsen has been struggling to compile reliable information about TV viewing as more and more of us use computers, smart phones and other digital devices to watch TV programs. The proliferation of nontraditional, non-real-time viewing has begun to fragment audiences, and the future only promises to further change viewing patterns.
TV program producers are scrambling to avoid repeating the upheaval experienced by music producers and distributors. They wish to hold on to their lucrative business-as-usual industry. My bet is that they are swimming against the tide. Just as online music distribution has made CDs as out of date as rotary phones, the blurring of the line between the computer screen and the TV screen will make traditional TV programming a much different enterprise, moving from the current advertising-supported model toward pay per view.
Sunday, April 10, 2011
Faster Not Always Better
Social networking and instantaneous, continuous news reporting have become so much a part of the landscape that even the most insignificant and trivial happenings get reported worldwide as important breaking news. Be it the police chasing a DUI driver on a Hollywood freeway or a robbery at a McDonald’s in Fargo, we see it live on TV and on the Internet. For sure there are important news items that we need to know about in a timely manner, but the ease of worldwide instantaneous distribution has proved that the old adage “engage brain before opening mouth” is truer now than ever before.
In the past, since there was a delay between the time a story came to light and the time it was reported on air or in print, there was most often sufficient time to get the facts. Today, because of the ravenous appetite of the 24/7 news services, there is pressure to release a story as quickly as possible and fill in the details later.
Just recently a news report circulated on the Internet that Samsung, a major electronics manufacturer, was installing on its new line of laptop computers a software program that captured every keystroke the user made and transmitted it back to Samsung. This type of software has been used by hackers to gather personal information from unsuspecting users; it usually gets onto a computer that has weak or nonexistent virus protection. For Samsung to be accused of installing this software on the computers it was manufacturing and selling was big news, and very bad news for Samsung: spying on customers could be devastating and is hardly conducive to increasing sales or improving a corporate image.
This report made the rounds on the Internet and was picked up by many individuals and forwarded in tweets and Facebook postings. As the saying goes, it went viral. All of this took only minutes to circulate. Within hours Samsung explained that it had not installed any spyware; a popular virus-scanning program from GFI, a company with no affiliation with Samsung, had generated a “false positive” warning. GFI admitted the error in short order.
In the past this story would never have been released in the first place, because the explanation would have been available before the newspaper article was printed or the TV and radio news reports were produced. Not so in the Internet age.
Sunday, April 3, 2011
My G is Faster Than Your G!
The battle rages on. On TV, in newspapers and magazines, and on billboards along the interstate, you can’t miss the ads touting 4G networks. If you believe all the claims, by now you must feel that if you don’t have a 4G phone you are really missing out. What you are missing, other than the letter that falls between “F” and “H” in our alphabet, is less than clear. It might be interesting to know that the companies touting these 4G networks really don’t know much more than you do.
Some history will help explain what all these “Gs” really mean. Back in the dark ages of mobile phones, when the smallest available device was briefcase-sized, phones used a network dubbed “1G.” The “G” stands for generation, and this mobile phone network technology was the first generation. It was developed in the early 1980s and was fine for the analog devices in service at the time, though it did require phones with protruding antennas.
The 1G networks were soon replaced by 2G, the first of the digital networks. With the number of mobile phone users exploding in the USA and around the world, the old analog system simply could not handle the traffic. A 2G system could accommodate 50 or more simultaneous conversations on the same frequency and allowed for smaller phones with built-in antennas. It was not, however, capable of efficiently handling data.
As more and more people wanted to stay connected while on the go, not only with voice but with email, the web, navigation services, and now social networking, carriers like Verizon and AT&T needed a revolutionary upgrade, and the result was the 3G network.
This brings us to the present and the 4G networks being touted by these same big carriers. The official definition of the capabilities of all the “Gs” is set by the International Telecommunication Union, the global wireless standards-setting organization. It has determined that a 4G network must be capable of download speeds of 100 megabits per second. In reality, none of the carriers is achieving anything close to this speed; in most cases they deliver less than 50% of real 4G. For sure they have fast networks, but no cigar, no 4G.
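To put those numbers in perspective, here is a minimal sketch in Python that checks a measured download speed against the ITU’s 100 megabit-per-second benchmark. The carrier names and speeds are made up purely for illustration; they are not real measurements from any network.

# A minimal sketch, not from the column itself: comparing hypothetical
# measured download speeds against the ITU's 100 Mbps benchmark for 4G.

ITU_4G_DOWNLOAD_MBPS = 100  # the standard cited above

def meets_itu_4g(measured_mbps):
    """True only if a measured download speed reaches the ITU 4G benchmark."""
    return measured_mbps >= ITU_4G_DOWNLOAD_MBPS

# Made-up carrier speeds for illustration only -- not real measurements.
for carrier, mbps in [("Carrier A", 12.0), ("Carrier B", 35.0), ("Carrier C", 48.0)]:
    share = mbps / ITU_4G_DOWNLOAD_MBPS
    verdict = "meets" if meets_itu_4g(mbps) else "falls short of"
    print(f"{carrier}: {mbps} Mbps is {share:.0%} of 100 Mbps and {verdict} the 4G standard")

Every sample speed in the sketch lands at well under half of the benchmark, which is exactly the gap between the marketing and the standard described above.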
Not to be deterred, the marketing gurus at many of the carriers seem to have decided collectively to ignore the official definition and develop their own. Perhaps this is not a big issue when you are talking about bits and bytes, but I do wonder what would happen if this trend carried over to BP or Shell. Could a gallon of gas be redefined by the gasoline companies as 14 ounces?