Wednesday, December 18, 2013

Is your website ready for 2014?

If you are in business in any way, you use the web to interact with your people. Clients, prospects, vendors, and employees -- they are all more effective when they use common, web-based tools to access information. That information flow is vital to the growth and sustainability of businesses of all sizes.

As the year draws to a close, and all eyes turn to plans for 2014, updating your company's website should make the short list of budgeted expenditures. You probably have a website and an existing URL (Uniform Resource Locator) with some good information about your company: how to contact the right people to get things done, place an order, check on an order, talk to someone about a sale, send an email, or post kudos or complaints. These functions should be as prominent on your new site as they are on your existing one.

Where your current site is most likely lacking will become more apparent in early 2014. Buyers are going mobile. This is not a new trend, but I believe the 2013 shopping season will provide a true tipping point for mobile adoption. Tablets, phones, phablets, convertible tablets/laptops, and a bevy of wearable tech -- these items comprise the “wish list” for many Santa-seeking gift receivers.

The hardware trickle-down theory also applies. As people get the "latest and greatest" toy or device, the last rev ushers a whole new wave of adopters into the mix. On the home front as well as at the office, many new users will become truly mobile-capable in the first quarter of 2014. I know in my company and home -- as I upgrade -- all my old tech steps down a level, enabling some new function or gadget for the "new" user of my "old" tech.

Some statistics:
  • Fifty-three percent of adult cellphone owners use their phones to access the Internet
  • On eBay, a sale is made from a mobile phone every two seconds
  • Twenty-eight percent of all Internet usage comes from a mobile phone/device
  • Smartphone sales have become bigger than PC sales
The numbers point to as much as a 60% saturation rate among 18- to 24-year-old users (buyers) as engaged mobile-platform users. That is a true tipping point in the market -- more than half of all users expect your website to be a working tool on a mobile device.

The outstanding question is simple: 
How do I ensure my site will work and respond on a mobile device?

As a first step, a "mobile-optimized" site must be rolled out. Google penalizes sites in organic search results when the target site is searched for from a mobile device. This means that if you are in the cat food business and you would be the "winning" search response (based on keywords, location, and user-generated content/reviews), but you DO NOT have a mobile-optimized site, your next-closest competitor with a mobile-optimized site will be returned as the lead response -- ahead of your listing in the organic (native) search index.

Reworking your site as a mobile-optimized resource is an involved process. Content must be reworked, made presentable on a screen as small as 3.5 inches, and made navigable with gestures rather than clicks. The ability to navigate a maze of information with swipes, touches, and squeezes is imperative to the project's success. Most sites depend on a traditional three-point interface (mouse, keyboard, monitor), not a touch interface.
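The mechanics of that rework usually start with responsive CSS. As a minimal sketch (the class name and the 480-pixel breakpoint are illustrative, not prescriptive), a viewport declaration plus a media query can collapse a two-column desktop layout into a single phone-friendly column:

```html
<!-- Tell mobile browsers to render at device width instead of a
     zoomed-out desktop canvas. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .column { float: left; width: 50%; }      /* two columns on desktop */
  @media (max-width: 480px) {               /* screens roughly 3.5in and under */
    .column { float: none; width: 100%; }   /* stack into one column */
  }
</style>
```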

As you work with your integrator or in-house marketing team, be sure to reference a site plan that includes true adaptive/responsive design tactics. The people/time/money invested will show returns in some unlikely places, as your rally point for information about your company embraces a new wave of mobile buyers and partners.

Wednesday, November 13, 2013

Cryptolocker Virus Can Decimate a Business

In recent weeks, a new breed of virus has emerged: truly effective "ransomware." Past iterations of ransomware, most notably the "FBI Virus," touted an urgent call to action involving a transfer of funds to the group that infected your computer. But the FBI Virus is removable; with the right geek involved, your data is not in jeopardy -- the virus can be removed and your computer returned to a functioning state. The current version of ransomware, known as "Cryptolocker," is no joke. Your files are truly gone. The only path to restoration involves paying the ransom, via Bitcoin.


So far, the standard rate for retrieval of your information is two bitcoins, at prevailing exchange rates averaging about $300 USD per bitcoin. In the majority of reported cases, paying the ransom actually works. About a thousand PCs a day are infected, though this rate may rise as known successes are reported in the malware community.

Paul Ducklin, head of technology for the Asia-Pacific region at SOPHOS, described the impact of the virus: “It's kind of like losing your computer or smashing your hard disk or dropping your computer in the harbor. You are never going to get your data back.”

Let me be clear: removing the virus does NOT regain access to your data. The information is locked, inaccessible, coded to a specific and complicated key. Cracking the key is a job for geeks armed with supercomputers and dual MBAs, and it is well beyond the scope of the average sysadmin or corner computer geek. This is no joke, no post-Halloween spoof. Infected file types run the gamut of often-used and critical files, including documents, databases, spreadsheets, photos, videos, and music collections.

The virus is also written to lock “backed up” versions of the files. The operative function of cross platform infection pivots on real time links between data structures. If you are using a common and widely adopted platform such as Dropbox, iCloud, Google Drive, or SurDoc, then your file storage may also be locked. During the past decade, users have moved away from a truly “cold storage” approach to file protection, but that old-school approach is the best way to combat the current malware attack.

PCs and laptops attached via VPN or other linked mechanisms are also prime candidates for this attack. The virus authors are looking for files that make the unlock keys an easy buying decision: a database of client information, current AR, banking info, ERP and CRM tables, and so on. The data types targeted are obvious. This corporate infection strategy was reinforced in the past week by the addition of a "buy more time" option to the ransom scheme. Initially, a 72-hour window was given, demanding two bitcoins for the unlock key. The new upcharge can range into the thousands of dollars (USD), allowing more time to gather funds to purchase unfettered access to your files.

For personal computer users, an image backup or cold file storage of the to-be-protected data is recommended. USB-capable storage is cheap; 32GB flash drives in the $30 range are plentiful. I recommend backing up crucial data as often as needed -- the question is, how many days of work can you afford to lose? Backing up your primary information monthly, at a minimum, is recommended. In the past, corporations and enterprises used tape backup, some form of shelved cold storage, to ensure information integrity.
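A monthly cold copy can be scripted in a few lines. Here is a minimal sketch in Python -- the source folder and drive letter in the usage comment are hypothetical -- that writes a dated archive to a removable drive, which should then be unplugged:

```python
import shutil
from datetime import date
from pathlib import Path

def cold_backup(source_dir: str, dest_dir: str) -> Path:
    """Zip source_dir into a dated archive under dest_dir.

    A dated name means each run leaves previous archives intact,
    so an infection today cannot overwrite last month's copy.
    """
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    archive_base = dest / f"backup-{date.today().isoformat()}"
    # make_archive returns the path of the .zip it creates
    return Path(shutil.make_archive(str(archive_base), "zip", source_dir))

# Example (hypothetical paths): back up a documents folder to a USB
# drive, then physically disconnect the drive.
# cold_backup("C:/Users/me/Documents", "E:/cold-backups")
```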

Returning to the trends of old, keeping a “cold copy” of primary data is critical. Even when the ransom is paid, there have been cases where the key is not provided. Ironically, the absence of a key, despite ransom payment, is laid at the feet of law enforcement. Should the authorities track the key generating server and shut it down before your key can be generated, your data is still locked.

Monday, November 11, 2013

Sticker Shock: ICANN's New Rules for TLDs

Creating a website -- currently a fairly inexpensive proposition -- is about to become a lot more costly and complicated.

Nowadays, domain names, starting as low as $3, are readily available and traded openly. Typically, staking a claim to a specific URL or domain name does not present a financial barrier to entry. In fact, even the least technical businessperson can tackle website creation, although hiring a professional will doubtless deliver higher quality results and should take less time.
Regardless of size, financial status, market share, or talent, the web offers an inexpensive and readily available platform for users of all types. But volcanic change threatens the equality Tim Berners-Lee envisioned when he crafted the World Wide Web.

Most domain names end in .COM or .ORG; in a "top 100 sites" listing by Ted.com, .COM accounted for half of the primary list. Another 39 used .ORG, while a mix of other top-level domains (TLDs) rounded out the ranking. Under ICANN's new rules, new and distinctive domain names will be released after review, and they will begin to become visible on the greater web in the fourth quarter of 2013.
A highly visible top-level domain, one of the first to gain approval and plan for market release, is .NYC. According to the City of New York's information page about the .NYC TLD, pricing for these domain names will be based on estimates of related revenue, a model lending itself to more of an auction-based feel than the traditional domain pricing model.

Qualification for acquisition of a .NYC-related domain is also limited. According to City of New York officials, qualified applicants will have a “bona fide presence" in the City of New York, meaning regularly performing lawful activities within the city and maintaining an office or other facility in the city. This is a drastic departure from traditional domain acquisition practices, where anyone with a few dollars can start a named web entity, regardless of location or taxpayer status.

A TLD differs from a traditional domain in its length and its price. New generic top-level domains (gTLDs) change the playing field, allowing for different associations between companies or cities and their web presence. Essentially, the introduction of this new class of domains separates the big players from the little guys. I question not only this fundamental change in the way the web routes naming conventions, but also the basic class-level distinction ICANN is allowing to emerge.

You can find the reasoning for this move by looking at ICANN's bottom-line revenue. According to its FY13 Operating Plan and Budget, ICANN's total operating income was $14.2 million in 2012, yet the organization predicts income of $85.9 million only 12 months later. Six times the operating income, year over year, is enough to entice any organization in a new direction. But I question at what cost this increased revenue will come.

If a wave truly becomes a sound at first hearing, then a product becomes a reality at first adoption. Based on the thousands of initial registrants for gTLDs -- despite the massive cost per registrant -- the market apparently believes this new TLD strategy will succeed.

Esther Dyson, founding chairperson of ICANN, says gTLDs "will create jobs [for lawyers, marketers and others] but little extra value." In 2012, the premium domain CDN.net sold for $185,000 -- a clear departure from the traditional pricing model for domain names, the cornerstone of commerce on the web.

The WWW has (for the most part) provided a level playing field for all entrants, regardless of size. True impact has been determined by the masses, nullifying the law of scarcity and allowing for viral growth of brands and goods based on value -- not marketing might or company size. How the new gTLD schema will affect search engines, product findability, and baseline competition on the web is still largely unknown. Time will tell as the new web suffixes hit the digital street. Hopefully the final vote -- the rate of adoption by consumers -- will hold the most weight.

Don’t Look Now... Google Glass Will Change Everything

Google Glass is coming, and it’s a large target for many points of controversy.

Given the open architecture Google traditionally uses, the applications for this product will continue to cause much debate while reaching for new heights of integration and digital overlay onto our daily lives. Competing wearable-technology products from the likes of Leap Motion and SixthSense, along with Kickstarter-based offerings entering this 21st-century gold rush, will soon flood the marketplace.

The underlying issue for business of all sizes is how this new technology will impact revenue cycles and buying trends. It’s an issue that has been ignored, surprisingly. But I believe we will see a new trend across all markets and all products, based on this emerging technology.

Prepping your business for this new tech is crucial. The crux of the matter lies in digital availability of the information that matters to buyers. Metrics like competitive analysis, up-to-the-minute availability statistics, and auction-based pricing (real-time supply and demand) must be exposed digitally to be accessible to the wearable-tech public. The appetite for real-time info will be voracious. Consequently, relying on people, not systems, to update relevant information about products and services will be a fool's errand.

Welcome to Tomorrowland
I live in a tourist destination, near a hub of international activity -- International Drive, in Orlando, Fla. It's a mash-up of languages and product offerings ranging from the bizarre to the predictable, and I can find them all just a few miles from my front door. I expect that, in the near future, wearers of overlay technology -- aka Google Glass -- will peruse products and services without ever entering the purveyor's place of business. Attraction-queue wait times, table wait times, dinner specials, arrivals and departures of local transportation shuttles, room specials -- all types of goods and services offered by businesses will be displayed digitally. From afar, a prospect may be lured or qualified without ever entering the business in question.

The delivery mechanism for this information will be web-based tech. Regardless of how you build your site (PHP/CSS, WordPress, Twitter Bootstrap, Dreamweaver, or Adobe Suite), all sites end up as some sort of consumable HTML code structure. Making sure your info is consumable by a wearer-style interface will be crucial. Supplying information that is relevant and current to the user, even more so.

From movable to wearable
Currently this can be achieved by a "mobile-capable" web extension -- a capability that is already a factor in Google's organic SEO ranking. But the missing link for many businesses lies in capturing relevant sale information and automatically updating inventory/availability metrics to the mobile-capable interface. This may entail tying in a node from a host- or hostess-stand POS (point of sale) to expose wait times for a restaurant, actively pulling data from a booking engine for demand-based rate structures (resort room pricing), demand-based pricing for retail outlets selling hard goods, or service wait times for hair and nail salons.
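To make that concrete, here is a minimal, hypothetical sketch -- the field names, the 45-minute table-turn average, and the queue model are all invented for illustration -- of turning host-stand POS state into a machine-readable snapshot a mobile or wearable client could consume:

```python
import json
from datetime import datetime, timezone

# Assumed average minutes before a seated table turns over (illustrative).
AVG_TABLE_TURN_MIN = 45

def wait_time_snapshot(parties_waiting: int, open_tables: int) -> str:
    """Estimate the current wait and emit it as JSON for a mobile client."""
    if open_tables > 0 and parties_waiting == 0:
        wait_min = 0
    else:
        # Each open table absorbs one waiting party immediately;
        # the remaining parties wait for tables to turn over.
        queued = max(parties_waiting - open_tables, 0)
        wait_min = queued * AVG_TABLE_TURN_MIN
    return json.dumps({
        "wait_minutes": wait_min,
        "parties_waiting": parties_waiting,
        "as_of": datetime.now(timezone.utc).isoformat(),
    })

# Example: three parties waiting, one open table -> 90-minute estimate.
# wait_time_snapshot(parties_waiting=3, open_tables=1)
```

The point is not the arithmetic but the plumbing: the number comes straight from the POS, not from a hostess typing an update.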

Traditionally, a staff person provides this information to a prospect walking in the door: the prospect inquires about the wait time, and the hostess answers. That human-to-human interface is lost in the overlay-marketing world. The opportunity no longer exists; it's replaced by a visual overlay that can be seen only with wearable tech.

The tourist market is a prime example of what is coming, but all markets will feel the impact. An overlay-tech user may get real-time competitive info by looking at a car in a showroom and seeing a list of all competing offers within five miles. Patients may query information about prescribed drugs, related healthcare information, or wait times in a specific practice's office. Concerned parents may "check in" on their children at daycare or find a child's location at a family outing. The implications of this new technology go beyond privacy concerns and digital tracking.
It will, in fact, change everything.

No Shortcuts Allowed in Website Design

There are some tried-and-true ways to help prevent disasters like the one that has befallen healthcare.gov.

Yesterday, in Why HealthCare.Gov Crashed & Burned, & How You Can Avoid That Fate, I touched on the management perils involved with so many contractors as well as site availability concerns. Today, let's look at the difficulties involved in accurate forecasting and dealing with siloed information.

Predicting the number of users for your company's newly released tools should be a simple calculation. How many potential users exist? Are you replacing an existing paper-based process? Delving into the number of steps and tasks completed daily in the existing process should yield a "best-case scenario" estimate when launching a new interface or process online. A number of tools exist that allow "on demand" scaling of web and LAN/WAN-based applications, and your go-to geek should be able to address scaling issues with ease, if expected adoption metrics can be estimated.
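The back-of-the-envelope sizing might look like this; every figure below is hypothetical, and the doubling "peak factor" is just one common rule of thumb, not a standard:

```python
import math

def peak_concurrent_users(daily_tasks: int, minutes_per_task: float,
                          busy_hours: float, peak_factor: float = 2.0) -> int:
    """Rough capacity estimate from a paper process being replaced.

    daily_tasks      -- tasks the paper process handles per day
    minutes_per_task -- time a user spends completing one task online
    busy_hours       -- hours per day the traffic is concentrated in
    peak_factor      -- multiplier to cover bursts above the average
    """
    task_hours = daily_tasks * minutes_per_task / 60
    average_concurrency = task_hours / busy_hours
    return math.ceil(average_concurrency * peak_factor)

# Example: 10,000 paper forms a day, 6 minutes each, over 10 busy hours:
# peak_concurrent_users(10_000, 6, 10)  ->  200 concurrent users to plan for
```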


Dependencies between data silos, a key failure point for healthcare.gov, are a tough nut to crack. Traditionally, paper-based processes allow for data staging between steps, normally managed by a well-trained employee or team manager. Housing data in specifically crafted spreadsheets or disparate databases allows for a managed flow of information between process steps. Automating these steps well relies heavily on discovery and the project's scope of work. Rules, segmented into a number of "if-then-else" statements, must be carefully crafted. Allowing for the majority of contingencies, and providing access to needed data in real time across the number of systems involved in crafting health care coverage, indeed presents a challenge.
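A rule chain like that can be expressed directly in code. This is a hypothetical sketch -- the field names, thresholds, and step names are invented, not drawn from healthcare.gov -- of the "if-then-else" routing that replaces a staff member's judgment:

```python
def route_application(applicant: dict) -> str:
    """Return the next processing step for an applicant record.

    Each branch names the upstream data silo the decision depends on,
    which is exactly the dependency map discovery should produce.
    """
    if not applicant.get("identity_verified"):
        return "manual-identity-review"      # identity silo hasn't answered
    if applicant.get("household_income", 0) <= 0:
        return "request-income-documents"    # income silo has no data
    if applicant["household_income"] < 25_000:
        return "subsidy-eligibility-check"   # hand off to subsidy rules
    return "standard-enrollment"

# route_application({"identity_verified": True, "household_income": 20_000})
# -> "subsidy-eligibility-check"
```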

When planning a process for your company, such as price quoting or inventory availability, you must ensure the data needed to calculate the needed information is available in real-time. Depending on a person or department to respond to a needed variable breaks the system of real-time responses to live users.

The bottom line, when outlining a process to be automated, is discovery, scope, and testing:
  • Carefully document the existing process prior to project scope (discovery)
  • Review the process, with the algorithmic replacement of well-trained staff (scope)
  • Diligently run the new tool through end-to-end testing, with the existing process team (testing)
A server is the perfect employee. It does not need sleep or vacation days, does not get jealous when another (better) server sits next to it, and ironically does not need health insurance. Moving your traditionally departmental- or team-based processes to a highly available application or portal is a logical step for any size business.

Healthcare.gov, though imperfect, represents a positive step forward. Using current tools to solve historically tough problems is a tried-and-true method in the world of businesses of all sizes. Hopefully, Washington can continue to improve and modify healthcare.gov. Obviously it is a work in progress.


Friday, March 29, 2013

EMRs Must Work Together for ARRA to Succeed


Through successive generations, electronic medical records have promised us lower costs, increased efficiency, and generally better healthcare. However, despite untold dollars and the efforts of some very smart people, we've yet to realize a fully electronic medical record.

Health information technology is far from a new concept. The first EKG machine, used by Alexander Muirhead in 1872, was designed to bring better medical solutions to ailing patients through technology. Today, technology and medicine are almost inseparable. Machines provide comprehensive data throughout the diagnostic path. Magnetic resonance imaging is an excellent example of the type of analysis that could not be completed without very powerful computers.

Yet most of this industry has yet to achieve full electronic sharing of data between patients, primary care physicians, referring physicians, pharmacists, and laboratories.

The Veterans Administration offers a shining example of what's possible. Its EMR system reaches or exceeds the goals of digital patient-record deployment on all levels. That's because the system's data -- each patient's record -- is usable and viewable by everyone within the patient care system. You have truly available data, from the patient's reporting of the chief complaint (at record origination) through the patient's discharge (or record close). All facets of care are encompassed within the same record and the same system.

In the open healthcare market, health information exchanges (HIEs) are supposed to handle this sharing of records across a common data structure. Regional extension centers were supposed to implement HIEs, which are an integral part of the ARRA/HITECH Act. But this plan has failed -- plain and simple. You can spread the blame widely, and many health IT players are doing just that.

Because of HIEs' lack of success, EMR implementations also are threatened, according to a December 2012 California Healthcare Foundation study. "Given the lack of tangible progress in expanding HIE capacity in California through federally funded programs... it is unclear how meaningful the widespread adoption" of electronic health records will be.

When looking at EMR adoption, ARRA/HITECH advocates cite a Medscape study from August (registration required) that found 74 percent of physicians are using these solutions. But there's a lot of disparity among systems, and the lack of a clear leader should concern IT professionals. According to the Medscape study, 22 percent of physicians said they use Epic systems, but an equal percentage chose the "other" option. Since no single package has a commanding market share, there's no common data structure across healthcare providers. Records are not necessarily accessible from system to system, meaning patients will find their information siloed across healthcare providers. Sound familiar?
No Front Runner in EMRs
Epic is the leader in the EMR market, an August 2012 report found, but there are many contenders. (Source: Medscape)
To realize the promises of procedural efficiencies, reduced costs, and improved care, healthcare itself must recognize something that's true across computing, regardless of industry, type, or size: It has to be easy to share data across platforms. For many generations, documents and spreadsheets have been portable across platforms and between users, irrespective of the relationship between the sharing parties. Any user, from novice to expert, can share media of all types, including videos. However, billions of dollars later, the infrastructure for one set of highly trusted users -- our medical professionals -- is still stymied by this challenge. Healthcare professionals still cannot share digital information between systems.

Frankly, I do not see a cure in the near future. But some sort of exchange medium must be found. We must concoct a common ground for sharing medical data -- for the health of the industry and, more importantly, our citizens.

Sixth Sense Tech + Open-Source = Exciting Opportunities

Sixth Sense Technology is an open-source alternative to Microsoft Surface, Apple's multi-touch, and the Linux MPX multi-touch. It's also inexpensive and, I believe, mind-blowing and industry-changing.

When Pranav Mistry previewed this technology during an enthralling TED Talk in November 2009, he promised that, one day in the near future, developers would have access to the tools they needed to build their own "Sixth Sense" devices, open-source-style.

Sixth Sense uses hand gestures to control interactions as opposed to a mouse, trackball, or screen. Mistry's goal: To seamlessly connect the virtual and physical worlds in order to make the computing experience more intuitive. As a result, he wanted to remove the digital world from the confines of the smartphone or tablet and bring that world to everyday objects such as paper, walls, and t-shirts. "Rather than getting your camera out of your pocket, you can just do the gesture of taking a photo and it takes a photo for you," he said in the video.
The Future Is Here
In his TED talk, Pranav Mistry demonstrated his open-source-based technology that eliminates the need for typical screens or hardware, using "screens" such as paper, wrists, or walls to display data.

Recently, Mistry lived up to his promise of sharing his powerful technology: the code and hardware component list (HCL) for Sixth Sense are now available. Wearable technology, as defined by Apple, Samsung, Dell, HP, and Microsoft, looks much different from the world Mistry envisioned four years ago. By offering this technology free and open-source-style, Sixth Sense is definitely not on the same roadmap the big boys of high tech are traveling.

The components you need to build a Sixth Sense device are both off-the-shelf and inexpensive. They include:
  • Camera to capture the user’s hand movements and gestures, take pictures, and interpret users' “framing” gestures. Approximate cost: $50
  • Projector for output (GUI), giving the ability to use virtually any surface as a “monitor,” as well as augmenting physical objects with real-time content. Approximate cost: $250
  • Mirror measuring 1” x 1” as a physical relay for the projected image. Approximate cost: $5
  • Microphone, used to enable paper as an interactive surface by capturing sound across the paper medium. With current camera tech, the microphone is usually part of the camera apparatus. Approximate cost: $0
  • Mobile computing device -- smartphone or laptop. Approximate cost: Varies
That totals about $300 in hardware, all based on readily available, off-the-shelf components. That's not a huge investment, given the impact of the technology. Currently, no (or perhaps only a few) VARs are marketing Sixth Sense, although I predict this will change -- soon. Packaging the system and its software into a usable wearable bundle at an inexpensive price is inevitable.

Gaming platforms adopted gesture- and voice-based recognition, allowing free-form game play and voice command of software. Microsoft, Apple, and BlackBerry have accustomed users to controlling their devices via gesture and swipe commands. Now a new tier of tablet manufacturers can pull up a seat at the mobile table, given the low cost barrier to entry and the potential market for vertical, specialized, inexpensive models or diverse new wearable prototypes based on these open-source specs.

The table has been set for a truly intuitive platform to sweep the marketplace. I foresee that the release of Sixth Sense as a free, open-source alternative will amplify an already explosive impact.

Wearable Tech Will Create BYOD Tsunami


The next big shift in IT's cycle will come strapped to our wrists.

We are already migrating into a landscape of people who are empowered with information-gathering and reporting devices. These portable devices, in order to be truly effective, require interaction with the cloud or the Internet. But today, these devices rely heavily on tethering, a trend that will not last given our insatiable demand for mobility and flexibility. Users, from top executives to students, will increasingly insist upon WiFi connectivity.
Dick Tracy Calling
Chester Gould set the bar high when he drew Dick Tracy and his high-tech wristwatch. (Source: Wikipedia)
The vision of the future, as drawn by Chester Gould, included wearable tech not far beyond our current technological reach. Yes, we have the components necessary to make a Dick Tracy watch. For connoisseurs of the 1940s comic strip, this is an exciting prospect.

Police Detective Dick Tracy, as drawn in the popular comic, sported a highly functional tool -- his watch. This device, which looked very similar to any other watch, featured capabilities far beyond the technology of the day, including videoconferencing, big-data search, and navigational guidance. In fact, it had components that exist, in larger form factors, in today's tech world.

Rumors that Apple will release a device akin to the Dick Tracy watch are surfacing across the web, although Apple has not confirmed these stories.

Apple stores stock bands for the popular iPod Nano, which many users wear like a wristwatch. Fossil released the Abacus AU5005, a wrist PDA running Palm OS first released in 2004, but it never gained market traction; it's possible Fossil could rerelease the Abacus with the recent advent of BlackBerry 10. (See: BlackBerry 10 Faces Enterprise, BYOD Hurdles.)

Almost three years ago, another big tech player cited the comic strip in a widely reported blog post. Carl Taussig, director of IS at HP Labs in Palo Alto, described a watch in development in one line: "We call it a Dick Tracy Watch." Since he posted that in May 2010, the HP product cannot be far off.
Handy, Dandy Device
People can customize Pebble's watch face, apps, and other features, according to the developer. (Source: Pebble)
There are smart-wristwatch options out there if you want one sooner rather than later. For example, Pebble expects to ship its customizable wearable tech, which lists for $150, next quarter (it's taking pre-orders now). Born from a Kickstarter campaign, this small manufacturer didn't wait for the big boys to get to market, carving its own niche early.

Sony, a much bigger player, offers the SmartWatch, which lets wearers access text messages, emails, and status updates, as well as apps on Google Play.

One reason these intelligent, connected wristwatches will succeed is our evolving connectivity. MediaTek, whose chipsets power many off-brand smartphones, is mounting a plan to release a no-contract smartphone in the $100 range. In fact, I've seen reports that the smartphone market will settle at a $50, no-contract price point.

Smartphones and other mobile devices built on operating systems with advanced connectivity options are becoming the norm, not the exception. Phones that look for WiFi, rather than defaulting to 3G/4G connectivity, are storming the market. As the saturation percentage increases, and as devices ship at lower price points, business environments of all sizes will be challenged with informal wireless device support, also known as BYOD.

Corporate offices, traditionally protected by a hard, firewall-based network edge, are being forced to recognize this micro army of devices. Incorporating the ad hoc user via tethered Bluetooth or 802.x-enabled WiFi is IT's reality. As I wrote recently, there are freely accessible tools whose sole purpose is to compromise networks via WiFi radio.

Understanding the impact of these devices in relation to governance, security, and accessibility requires a current look at adoptable and available technology.

Too often, when I broach this topic with clients, they greet me with understanding nods and placating statements. They get that "this is the future." Rarely do they acknowledge that the future is here. A real threat exists -- not tomorrow, but today.

In nature, a tsunami builds far off the coast, spurred by an underwater shift in the landscape, far below the visible surface. On the horizon, you can't see any real change. Only after the massive wave has had time to build and gain momentum does the destructive force strike a coastline. Plummeting hardware costs, market innovators, user demand, and established standards constitute the digital plate shift -- and these events have already happened in our industry.

The tsunami of BYOD, in the form of inexpensive wearable tech, is about to hit the coastline of IT. Can your department swim?

Beware: This Is a Neighborhood Hack Community


When one of my team members recently responded to a network-down call at a client's home office, he quickly established, beyond a shadow of a doubt, that the edge device -- a customer premise equipment (CPE) router -- had been compromised. Regaining control of the edge router proved difficult. A neighbor who was within range of the 802.11n wireless radio was actively hacking the router, overpowering the field tech with password resets in real time. In order to regain control of the network, the tech had to remove the antennas, allowing for LAN access only to the routing edge, and effectively cutting off the hack by cutting off the hacker's basic access.

When the field engineer recounted this story, I was skeptical. Traditionally an active hack on an edge device is something found only in corporate networks that house a treasure trove of corporate information. You know: credit card numbers, medical records, all sorts of information that someone can exploit for monetary gain. Traditionally, the sophistication level required to overpower an edge device has been out of reach to the "script kiddies" or low-level code miscreants.

My engineering meeting soon dissolved into a heated discussion about the availability and sophistication level someone would need to compromise CPE. We decided to pull the packet logs from the routing edge and sleuth out the user type, based on traffic type. The compromised network did not have any saleable information, only client contact files. There were no credit card numbers -- just your basic contact information, such as email addresses and cellphone numbers. Not exactly the family jewels, digitally speaking.

We determined the leveraged network was used as an access device for a gaming console -- with access only being sought after school hours. Based on the traffic type, we figured our hacker was a school kid playing Gears of War. In other words, a middle-schooler had bypassed what has been thought of as a sophisticated security paradigm. This kid had changed basic access passwords, altered the default IP address scheme, and enabled WPA2 security. None of the traditional security tactics worked; this kid defeated them all, in real time.
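The port-based sleuthing above can be sketched in a few lines. This is a minimal illustration, not our actual tooling: the packet-log format and most of the port map are assumptions, though UDP 3074 really is the well-known Xbox Live port.

```python
# Minimal sketch: classify packet-log rows of (protocol, dst_port, hour).
# The log format and port map are illustrative assumptions; 3074 is the
# well-known Xbox Live port.
from collections import Counter

PORT_HINTS = {
    3074: "gaming-console (Xbox Live)",
    80: "web",
    443: "web (TLS)",
    53: "dns",
}

def classify(rows):
    """Tally traffic categories from (protocol, dst_port, hour) tuples."""
    tally = Counter()
    for proto, port, hour in rows:
        label = PORT_HINTS.get(port, f"other ({proto}/{port})")
        # Flag after-school hours (3 p.m. to 10 p.m.), as in our sleuthing.
        if 15 <= hour <= 22:
            label += " [after school]"
        tally[label] += 1
    return tally

log = [("udp", 3074, 16), ("udp", 3074, 17), ("tcp", 443, 16)]
print(classify(log).most_common(1)[0][0])  # the dominant traffic category
```

Run against a real export, a tally like this is enough to tell a gaming console from a file thief at a glance.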

Our answer to the client was simple: Upgrade your edge device to a more current hardware version, hoping to outrun the digital truant. The teen has not gained any access since the upgrade. Time will tell.

The question bothered me. How could a middle-schooler with an Xbox defeat an encrypted password set? Apparently, there are videos out there that explain how to do this. A security consultant, Pedro Joaquin of Mexico City, said he has sent these exploits to CPE manufacturers and, he warned, none are immune.

As a security professional, my visit to one exploit site sent a chill down my spine. It showed a tool set that can be captured and used offline from a tablet, gaming console, or cellphone -- anything that runs Java. More than 115 exploits are available, free, anytime -- on or offline -- from virtually any device with an interface and a processor.

In the past, my greatest concern for small and midsized business users has centered on their adherence to good password management -- changing the shipping default user name and password on edge devices, for example. Too often, the CPE that the broadband service provider delivers is not accessible to the end user or contracted IT resource. Given this highly available tool kit, that fact plays into the hands of the nefarious user or digital thief.
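Replacing a shipping default password costs nothing but a few minutes. Here's a minimal sketch using Python's standard `secrets` module; the character set and length are arbitrary choices, not a policy recommendation:

```python
# Minimal sketch: generate a strong random password to replace a router's
# shipping default. Uses the standard-library "secrets" module, which draws
# from a cryptographically strong source (unlike "random").
import secrets
import string

def strong_password(length=16):
    """Return a random password from letters, digits, and a few symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(strong_password())  # e.g. a 16-character string; different every run
```

Paste the result into the router's admin page, store it in a password manager, and the "default credentials" exploit class disappears.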

In a vast conglomerate or midsized organization, think about all those employees who telecommute part time, who answer business emails from their iPads, or revise spreadsheet figures at the breakfast table on Monday morning. These days, who doesn't work from home sometimes? Which colleague might be sending confidential or sensitive data from his flimsy household network? Who's living next door, down the road, or across the backyard?

Taking your data to the cloud, and getting your primary data storage out of the LAN and into a more secure enterprise-based solution, is becoming more urgent. For many generations of data access -- a cycle that runs in dog years -- the web has not been a nice neighborhood. Data thieves, traditionally looking for saleable data, are around every corner. With the advent of easy-to-use exploit tools, the cancer of hacking has spread to your front door.

Adopting 802.11ac Standard Saves Money, Adds Value


This year's opening keynote speech at the Consumer Electronics Show (CES) broke the mold: Dr. Paul Jacobs, CEO of Qualcomm, delivered the first non-Microsoft presentation in a dozen years. Though Microsoft CEO Steve Ballmer did make an appearance, the presentation was dominated by a core technology Qualcomm is releasing this year.

In fact, 802.11ac-supported products are across the marketplace, available not only from Qualcomm, but also from Broadcom, Marvell, MediaTek, Quantenna, and Redpine. The product offerings from this wide array of chipset manufacturers range from routers and tablets to a selection of handsets, a.k.a. cellphones.

Though vendors are highlighting many technologies at CES, the most easily adoptable and (in my opinion) most impactful technology is the 802.11ac standard. This latest rev of wireless tech brings some real flash-bang delivery to a highly adopted mobile user base.

The new 802.11ac standard has some dizzying tech-speak surrounding why it's head and shoulders above previous iterations. The geeks reading this can dig into the IEEE 802.11ac specification for more information. For the less geeky, the answer is simple: 802.11ac supports more users with faster delivery of content. When dealing with tech, faster is better -- every time.

Though the 802.11ac standard has been out for more than a year, the adoption curve is just hitting the marketplace. Routers ranging in price from $137 to $230 are available from Asus, NetGear, D-Link, Belkin, and Buffalo -- a variety that announces 802.11ac is here to stay.

You can find the real punch in the devices carrying chipsets that leverage the 802.11ac connectivity gains. There's a range of quad core processing chipsets in a variety of devices that truly make this nominal investment in new hardware well worth an upgrade. Cellphones, tablets, smart TVs, laptops, and gaming consoles -- you name it, the device will have this chipset. The range of 802.11ac-enabled devices will grow as the year progresses.

The routing edge is not the only place to see an increase in functionality with a small price upgrade. If your laptop, smart TV, or wirelessly linked PC is in need of a faster and richer connection to the content-filled web, you can use an add-on dongle or in-box card (PCI). Ranging in price from $17 to $35, this inexpensive add-on tech, paired with an upgrade at the routing edge, is a real bang for your IT buck.

For IT pros, adding an 802.11ac-enabled Wireless Access Point (WAP) to an existing network is not a bad idea. After all, with an 802.11ac WAP or wireless repeater, you can give all those newly purchased tablets, smartphones, and current-generation laptops a richer experience -- without upgrading the routing core.

A major gain within the framework of the 802.11ac standard includes doubling the number of real users supported by any given access point, avoiding previously encountered bottlenecks at the users' edge. Now that's an argument you can use to free up those tightly held IT budget dollars.

Though the New Year will bring many flashy tech gains, including 802.11ac in your first quarter IT plan is a huge gain at the point of integration -- the user experience. Why not leverage this new tech with a low dollar buy? After all, supporting 802.11ac is the right move for your users, providing speed and depth of user experience -- whether it's for a residential network, small office environment, or a campus-wide implementation -- for a small outlay.

Get Back: Returning to Windows 7


Whether it's because they like the stability of Windows 7, have an existing investment in earlier third-party applications, dislike Microsoft's latest operating system, or have another reason altogether, some IT professionals don't want to run the Windows 8 OS that comes preloaded on their companies' new desktop PCs. Or perhaps they plan to run Windows 8 -- but not yet.

I've found that only about 10 percent of new desktops are still available with Windows 7. Market-leading PC vendors' websites show about six to 10 of their 40 to 60 offerings ship with Windows 7 preinstalled. This percentage will only drop as we move forward. I expect that by the end of March, only specialty orders will allow for Windows 7 preinstalled.

Right out of the box, I have Windows 8: How do I get back to Windows 7 if I'm not ready and willing to make that commitment?

Microsoft's Replacement Department, the starting point in this licensing quest, is prepared to handle this issue. After you provide the call center with your Windows 8 product key and part number, the company will ship out Windows 7 media; you'll be charged $30 for the media and shipping. In return, you'll get a boxed retail copy of Windows 7 Pro with a license key. The associated computer will then have a shipping Windows 8 license for later use, plus a current, fully licensed Windows 7 Pro desktop -- the best of both worlds.

After all, your company will most likely upgrade to Windows 8 eventually. By planning ahead, you can eliminate some obvious training issues that arise from the great differences between versions 7 and 8.

Press start
When you finally do begin rolling out Windows 8 to your company's users, be prepared to deal with the Start button. Or, to be more accurate, the lack of one.

For $5 per desktop, you can buy Start8, third-party software by Stardock that adds the button back to the Win8 desktop. It's a cheap way to deal with the first issue you'll probably encounter in Windows 8 training.

After all, training is the key to successful software adoption. Any technology must have the confidence of the users. If your user base does not believe in its ability to use the tools at hand, the software will fail, not due to bad code or poor installation, but for lack of use. Employees will fall back on older and what's perceived as "more reliable" technology to complete their tasks. At the end of the day, they must get their jobs done, regardless of current IT initiatives.

IT must do its job as well. Throwing users to the interface sharks is not a good path to building confidence in your organization's support team. Empowering users with new tools is part of the path to success in the business world. The real work for IT is simple: Teach your user base how to be more efficient with the new tools. In the end, users will need less support if you provide advance training in new software or operating systems.

There May Already Be an App for That


There are so many applications readily available today, often for free, that jumping into mobile application development is a decision no company should take lightly.

Though it's sometimes difficult to recognize from the results, even bad apps were developed to accomplish one or more goals. Typically, apps are born of frustration among a group of potential users -- whether they're in IT, sales, marketing, human resources, or elsewhere. Yearning for a more efficient process or a better result, an energized individual or group embarks on a solution to a specific challenge: providing timely information, repeating a known calculation, integrating a big-data grab and translating the information into a pertinent interface, or some other task that demands a better solution.

A good litmus test for any application development team is to search for something that could already solve your problem. If nothing is available, there could very well be a reason. Be on the lookout for a hidden challenge that could present itself during the development process. You may encounter a missing piece of technology, or you may need an expensive testing process. There could be an immature standard or an interface that is broken beyond all recognition.

After finding something that is close but not quite close enough, I suggest you analyze the near-miss app. What keeps it from providing the functionality you need? You may want to contact the app's developer to see if it can add or subtract capabilities. Remember that the goal here is to satisfy your need for information, fix a broken process, or find an operational efficiency. Building an app just because you can or because it is cool is not a real reason. Refer to your frustrated user. What is that user's pain?

If you cannot find a workaround, your next step is application development. Defining what platform to use -- the best piece of hardware for your application -- is a seminal step. You must understand the user's environment, mobility needs, and governance standards. Leveraging a platform like Facebook, Google, or YouTube speeds up your release date and cuts down on development costs.

The nuts and bolts of any application development rely on two major components: the application programming interface (API) and the software development kit (SDK). They allow your development team to interface with data and function on multiple platforms.

An SDK allows the front end to perform on multiple platforms -- iPhone, iPad, Android, Windows, or Linux. A good API takes all currently usable technologies into account.

APIs give the data engines used by the front-end software the ability to share information safely and securely. All software projects consist of three components: backend data, middleware, and front end (or user interface).
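The three-component split can be sketched in miniature. Every name below is hypothetical; the point is the separation of concerns, not any particular framework:

```python
# Minimal sketch of the three components of a software project. All names
# here are hypothetical -- the point is the separation of concerns.

# Back-end data: the raw store.
ORDERS = {101: {"item": "widget", "qty": 3}}

# Middleware (the API layer): validates and mediates access to the data.
def get_order(order_id):
    if order_id not in ORDERS:
        raise KeyError(f"no such order: {order_id}")
    return dict(ORDERS[order_id])  # hand back a copy, never the raw store

# Front end (user interface): formats for whatever device is asking.
def render_order(order_id):
    o = get_order(order_id)
    return f"Order {order_id}: {o['qty']} x {o['item']}"

print(render_order(101))  # Order 101: 3 x widget
```

Because the front end only ever talks to the middleware, you can swap the data store -- or add an iPhone, Android, or web front end -- without touching the other layers.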

Losing sight of the goal (the end user's pain) is often a byproduct of application development. When considering whether you really need to write an app for that, look carefully. You might find another frustrated user who has already met your need.

The Science of Selecting an IT Partner

Finding a good partner in today's world of bring-your-own-device, cloud computing, social networking, rampant cyber insecurity, and governance concerns might seem a daunting task. That's because it is, plain and simple.

But selecting the best-suited solution provider is the best way to augment your (no doubt) overburdened internal IT staff on deadline tasks, find a team with complementary talents, or gain access to skills you don't have in-house.

In many cases, IT departments took a big hit in early 2009 and have seen little growth since. Tightened budgets and limited IT purchasing dollars greatly reduced the need for internal technology resources.

Now funding is loosening and companies are hiring again -- when they can find the right people for the jobs. As you begin your quest for well-qualified service providers, here are some guidelines for finding the right company to support your needs, now and in the future.

Vertical market expertise
Understanding the needs of your users, your company's governance issues, its known integration hurdles, and its hardware requirements is as much about having already made the crucial mistakes (hopefully at the test bench) as it is about general integration experience. Different business models and units (sales, manufacturing, just-in-time, RFID, and cloud, for example) require different levels of partner and certification commitment. A partner is far more likely to anticipate your needs, wants, and requirements if it already understands your minimum working requirements because it knows your industry from prior experience.

Understanding BYOD
BYOD is a current example of a technology path that is pressing and present in the support and integration market. A prospective partner that gives you the thousand-yard stare when you inquire as to referenceable BYOD adoption is not the right partner for your immediate and long-term needs. After all, BYOD is something most -- if not all -- midsized enterprises are considering today; no doubt it will be on your company's radar sooner rather than later.

Support team
At least 80 percent of all support calls are resolved with one of three questions: Is the failing device actually turned on? Does your login give you rights to the resources you need to execute the program? Are you using the right password, or does it need to be reset? Only in the other 20 percent or so will you find real challenges. Continuity of support relies on retained knowledge about very specific processes and architecture. If you're handing over employee support to a partner, make sure its best practices and procedures match your own company's policies.
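The three-question triage amounts to a short decision ladder. A minimal sketch, with illustrative field names:

```python
# Minimal sketch of the three-question support triage described above.
# The parameter names are illustrative, not a real ticketing schema.
def triage(powered_on, has_rights, password_ok):
    """Return the first-line fix, or None to escalate to the harder 20 percent."""
    if not powered_on:
        return "Power the device on."
    if not has_rights:
        return "Grant the login rights to the required resources."
    if not password_ok:
        return "Reset the password."
    return None  # none of the easy answers apply -- escalate

print(triage(True, True, False))  # Reset the password.
```

A prospective partner's help desk should walk this ladder in the first minute of a call; anything slower is a red flag.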

Knowledge sharing 
When interviewing prospective partners' current clients, ask: “Do they pass along needed information and user training as a regular course of support?” If your company adopts a new process or application, the end-user training should change each time. Users empowered with enough knowledge to use the tools at their disposal are easier to support and, in fact, need less tech support.

Innovation and successes
It's important to find a partner who has a track record of innovation, one who invests in learning and implementing the adoption of new processes and software. Ask the potential partner: “What is your most recent and greatest efficiency gain for the end users you support?” The answers can be most revealing.

Strategic partners 
No solution provider/integrator stands alone, and it's important that your partner demonstrates its technical prowess through vendor certifications, ongoing education, governance compliance certification, and other measurable forms. Select a partner that has an unbroken string of partner support resources. These vendor or industry partners should be relevant to your organization, and certifications should be verifiable.

Your integration partner should be a strong ally as you harness the power of technology to enhance business processes to become more competitive, more agile, and more profitable. Finding success and victory in digital efficiency is about working with a knowledgeable and professional IT partner every time.