Archive for November, 2010

The Lawrence Berkeley National Laboratory estimates the cost of power outages in the United States at over $80,000,000,000 (that’s $80 billion) each year. The major outage in 2003 that blacked out large portions of the Northeast is estimated to have cost the city of New York over $1 billion (or $36 million per hour) and to have had an overall cost of over $6 billion (these figures include direct costs as well as indirect costs such as inventory lost to spoilage, lost business revenue, etc.). Clearly our aging power grid poses real risks to our economy. Given our reliance on computers (which require electricity) for everything from war fighting to financial transactions to public transportation, it’s also easy to see that our aging infrastructure poses a real threat to national security. Our government has already discovered concrete plans by terrorist organizations to target our electricity grid.

So how can IT help? Broadband technologies coupled with cutting-edge demand management software, outage monitoring sensors, and improved data storage and analysis can improve both the cost effectiveness and security of our energy grid. Currently the grid is a one-way street. Energy flows from one of the thousands of utility companies to one of the millions of homes and businesses across the US. Each month the usage of that home or business is read (either over a network or by a meter reader) and a bill is prepared. When an outage happens, the utility is slow to find out and slow to respond.

In the future, a smart grid can help in all these areas. A smart grid turns our one-way street into a two-way, multi-lane superhighway. Electricity flows from the utility to the consumer, but it can also flow from the consumer back to the grid (think solar panels, windmills, and electric car batteries), allowing for more effective peak demand management and creative revenue models.

Demand management software can keep the generation of electricity optimized. Currently, peak demand (or critical peak demand) forces utility companies to purchase expensive electricity on the spot market or to fire up old, outdated, expensive, and heavily polluting plants that are kept only for emergencies. New software can help both customers and utilities manage demand by issuing load control commands (imagine the utility remotely turning off your air conditioner) and voluntary demand incentives (imagine getting a text message asking you to turn off your A/C in exchange for a credit on your bill). All of this can be automated and optimized.
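To make the idea concrete, here’s a minimal sketch in Python of what that decision logic might look like. Everything here is a simplifying assumption on my part (the thresholds, the Customer class, and the send_command/send_text hooks are stand-ins), not any real utility’s system:

    from dataclasses import dataclass

    # Hypothetical demand-response sketch. Thresholds, device commands, and
    # message text are illustrative assumptions, not a real utility API.

    SPOT_PRICE_LIMIT = 120.0    # $/MWh: above this, spot purchases get painful
    CRITICAL_DEMAND_MW = 950.0  # load level that triggers direct load control

    @dataclass
    class Customer:
        name: str
        enrolled_in_load_control: bool

        def send_command(self, command: str, minutes: int) -> None:
            # Stand-in for a head-end system talking to a smart thermostat.
            print(f"{self.name}: {command} for {minutes} min")

        def send_text(self, message: str) -> None:
            # Stand-in for an SMS or email notification service.
            print(f"{self.name}: {message}")

    def manage_peak(demand_mw: float, spot_price: float, customers: list) -> None:
        """Shave peak load: load control at critical peaks, incentives otherwise."""
        if demand_mw >= CRITICAL_DEMAND_MW:
            for c in customers:
                if c.enrolled_in_load_control:
                    c.send_command("AC_CYCLE_OFF", minutes=15)
        elif spot_price > SPOT_PRICE_LIMIT:
            for c in customers:
                c.send_text("Turn down your A/C for the next hour for a bill credit.")

    manage_peak(demand_mw=960.0, spot_price=95.0,
                customers=[Customer("Smith residence", True)])

The point is simply that once meters and devices are networked, the choice between forced load control and a voluntary incentive becomes a few lines of automatable policy.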

Remote sensors along the grid can provide real-time information to utility companies about the health of the grid. The 2003 blackout was caused when one of the high-capacity lines in Ohio became overtaxed, causing it to heat up and sag (as lines carry more power they get hotter, which causes them to expand and sag). This particular line sagged too close to an untrimmed tree, resulting in a “flashover” that caused a surge. Ultimately the surge caused a cascading blackout affecting millions. A remote sensor on this part of the grid could have alerted the local utility (in this case FirstEnergy Corporation), and the line could have been shut down or throttled. (I’m intentionally ignoring the fact that better landscaping could also have prevented the blackout.)
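Here’s a hedged sketch of the kind of check such a sensor could run. The temperature and current thresholds are invented for illustration; real transmission monitoring models conductor physics in far more detail:

    # Simplified line-monitoring sketch; thresholds are illustrative only.

    MAX_SAFE_TEMP_C = 90.0    # conductor temperature at which we alert
    RATED_CURRENT_A = 1500.0  # assumed thermal rating of the line

    def check_line(sensor_temp_c: float, current_a: float):
        """Return an alert string if the line is running hot or overloaded."""
        if sensor_temp_c >= MAX_SAFE_TEMP_C:
            return f"ALERT: conductor at {sensor_temp_c:.0f} C, sag risk -- shed load"
        if current_a > RATED_CURRENT_A:
            return f"ALERT: {current_a:.0f} A exceeds rating -- throttle or reroute"
        return None

    alert = check_line(sensor_temp_c=94.0, current_a=1420.0)
    if alert:
        print(alert)  # in practice this would page the utility's control room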

Finally, better data management through the smart grid can mitigate the effect of outages and reduce their overall impact and cost. Outage Management Systems (OMS) can proactively monitor a utility’s network and take action in the event of an outage, decreasing response times and limiting the scope of outages, leading to faster restoration of service.

IT is needed every step of the way to bring about a future smart grid. With over 3,000 utility companies in the United States, interoperability standards will need to be developed (my guess is IP, but it could be something entirely different). Additionally, technologies for the consumer will need to be developed. I’ve had the privilege of testing some of the most cutting-edge load control devices, thermostats, and in-home displays, and the information they provide will truly change consumer behavior. This is a win for our economy, a win for the environment, and a win for national security.

The National Broadband Plan, the 2009 stimulus bill, and other legislation have already highlighted the benefits of a smart grid and provided some early funding. Large-scale programs are underway in TX, CA, FL, and other states, and soon the technologies will be deployed in CT. The future is uncertain, but the potential is great.

Good Talk,
Tom

[Sources: http://en.wikipedia.org/wiki/Northeast_Blackout_of_2003, http://www.elcon.org/Documents/EconomicImpactsOfAugust2003Blackout.pdf, http://www.lbl.gov/Science-Articles/Archive/EETD-power-interruptions.html ]

Read Full Post »

Recently the federal government released something called the National Broadband Plan, which lays out the framework for a national effort to ensure that, as a country, we are making the most of available broadband technology. One of the challenges the National Broadband Plan hopes to tackle is realizing the synergies between health care and IT. The plan is comprehensive in nature and lays out five key elements that will allow our health care providers to utilize current and future IT trends to provide better, cheaper, and more efficient care.

Ensure Access to Affordable Broadband by Rural Health Care Providers – the plan suggests that the FCC make use of around $400 million in annual funds already authorized. The concern is that commercially available broadband is often priced too high to be affordable, or is simply insufficient to support modern health care needs.

Create Economic Incentives to Encourage Broad Adoption of IT and IT Innovation – this tenet of the plan aims to increase the use of innovative IT products to improve the overall health care system. It supports Electronic Health Records and E-care. This part of the plan recognizes that broadband connectivity alone is not sufficient to get the most out of the current and future IT environment.

Use New Techniques in Data Analytics – Given the massive amounts of data generated and stored by modern IT systems, a move to broader adoption of health care IT will allow providers to make use of advanced analytics techniques. Spotting patterns in data may lead to better treatments and new cures.

Revise Standards around Licensing, Credentialing and Privileging – Many of these rules were written in the 20th century in a time before broadband and the technology of today. The National Broadband Plan aims to re-write these rules to speed the adoption of E-Care.

Protect Privacy – This is probably the biggest issue in Health Care IT. We will not see widespread adoption of electronic health records, E-Care, or health care IT until providers and patients are comfortable with the standards of privacy. In a time when we expose more and more information about ourselves online, most people are still not willing to share their medical information with anyone other than their provider and their insurance company.

These are the five major points of the National Broadband Plan as it relates to Health Care IT. In other posts I will examine the plan as it relates to other areas of national interest.

Good Talk,
Tom

Read Full Post »

If you read my recent post about corporate IT policy (and my employer’s refusal to allow Chrome) you’ll know that I’m not a big fan of Microsoft Internet Explorer. I find it to be slow and somewhat prone to security holes. So I found it interesting that back in July IBM made the decision to use Mozilla’s Firefox as the default web browser on employee computers. In the announcement, Bob Sutor, IBM’s vice president of Linux and open source software, called out five key reasons for the move:

  • Firefox is stunningly standards compliant, and interoperability via open standards is key to IBM’s strategy.
  • Firefox is open source and its development schedule is managed by a development community not beholden to one commercial entity.
  • Firefox is secure and an international community of experts continues to develop and maintain it.
  • Firefox is extensible and can be customized for particular applications and organizations, like IBM.
  • Firefox is innovative and has forced the hand of browsers that came before and after it to add and improve speed and function.

The move seems to make sense for IBM, but what does it mean to the larger business community? In other words, why is this important?

First, it means approximately 400,000 employees of IBM will now automatically have Firefox installed on their machines. This is a fairly significant number of users and demonstrates that Firefox can be an appropriate enterprise-wide solution for web browsing (and that there are alternatives to IE).

Secondly, it is a big win for open source and open standards. Mozilla has kept Firefox steadfastly open source, with excellent documentation and strong security. As more and more applications move to the cloud, we are all going to rely on open standards for security as well as interoperability and ease of use. IBM’s adoption of Firefox is a big step in that direction.

Ultimately this move should benefit IBM employees with faster browsing and a better online experience. But, I think the move benefits all of us as this “endorsement” should spur interest and faith in open source solutions for other enterprise tools.

Good Talk,
Tom

Source: http://www.sutor.com/c/2010/07/ibm-moving-to-firefox-as-default-browser/

Read Full Post »

The Army is rarely thought of as an organization relying on IT. We’re more apt to associate the Army with mortars and guns and basic training, but the truth is that today’s Army is a huge purchaser and user of Information Technology. Whether it’s campaign coordination by the generals or real-time data updates to the soldiers on the ground, the increasing use of IT has become a definite advantage in the Army’s war fighting ability.

Recently Lt. Gen. Jeff Sorenson, the Army’s Chief Information Officer, put out a memo announcing the Army’s new policy on the procurement and implementation of software solutions. The focus is on ensuring compatibility of systems across all Army units. The memo spells out a Common Operating Environment (COE) for the entire US Army and states, “Implementation of COE will decrease the time it takes to deliver relevant applications to the war fighters who need them, and decrease the cost of doing so.”

The COE policy is part of a larger Army policy called Army Software Transformation (AST) that aims to move the Army’s software procurement, implementation and management policies to the cutting edge and make our war fighters even more effective. Other aspects of AST include improved email (a move to Exchange Server), Active Directory, Enterprise Network Operations, Data Center Consolidation, and a move toward Agile development.

From General George Casey, US Army Chief of Staff: “We’re building an Army that is a versatile mix of tailorable and networked organizations operating on a rotational basis…to provide a sustained flow of trained and ready forces for full spectrum operations…and to hedge against unexpected contingencies…at a tempo that is predictable and sustainable for our all-volunteer forces.”

The new IT policies are helping to improve today’s Army and make our soldiers more effective in fighting the new kinds of conflicts we’re seeing in Afghanistan, Iraq and around the world. I’m confident our soldiers can rise to the occasion and create a world class IT organization.

Good Talk,
Tom

Sources:
http://www.federalnewsradio.com/index.php?nid=35&sid=2116410
http://ciog6.army.mil/ArmyEnterpriseNetworkVision/tabid/79/Default.aspx
http://ciog6.army.mil/LinkClick.aspx?fileticket=j4DkCajsfGQ%3d&tabid=79

Read Full Post »

Mobile smart phones are clearly one of the fastest growing technologies of the last decade. The ubiquity of Blackberries, iPhones, Droids, Evos, and others has, in many ways, made life easier for everyone. We are now in near constant contact with clients, colleagues, friends, and family. From a productivity point of view this is a very good thing (we’ll leave the psychological and emotional discussion for another time). We can be much more responsive to the needs of those around us, and we can be much better informed than at any time in the past. However, there is one issue that is beginning to creep up in companies around the world.

Most IT departments, especially at large companies, have a fairly well defined policy about the use of the company’s network resources and what types of behavior are permissible and what are prohibited. These policies vary from company to company and are enforced with varying degrees of effectiveness. I’ve worked with companies that allow Facebook and YouTube at work, recognizing the need of employees to take a mental break every now and then. I’ve worked at a company that allows only a predefined list of sites necessary to conduct business (even sites like NYTimes.com and Yahoo.com were blocked). And then I’ve worked at companies that fall somewhere in the middle (i.e. no business use for YouTube, but you might want to buy a gift for a boss/coworker on Yahoo Shopping). In all of these cases, my conversations with IT managers have made clear that the policies are well thought out and consistent with the culture and values of the company. All offensive content is blocked at 99% of the companies I’ve worked with (one had no web filter at all and a very open culture).

But how do you deal with the little computers in our pockets that run on a cell phone signal? When employees can bring their own network to work, the risk to companies is much higher. Clearly, IT can manage the risk of viruses/malware to the network (after all, my Droid does not interact with my client’s network very often). This risk is fairly straightforward and familiar to IT managers. But what about the risk of displaying offensive content? IT can’t filter the browser on a privately owned smart phone. I’m sure we all know colleagues who have called up the latest YouTube sensation at lunch or on a break. I’m sure a couple of people even know colleagues who have shown pornography at work (thankfully, I’ve never been in that situation). From an HR point of view, the risk of a sexual harassment claim or a hostile work environment claim (think offensive jokes, videos, etc.) is increased by the growing presence of our smart phones. Additionally, most of these smart phones have cameras. There is a real risk of employees photographing confidential information.

So what is the answer? Honestly, there is no easy answer. I know a couple of employers that ban cell phones inside all their buildings (mostly employers involved with classified government work). This seems like an extreme measure, and for some industries (sales and consulting come to mind) it would cripple your workforce. If there is a simple technology solution, I’m not aware of it. I think the solution will be a mix of carefully enforced policy and a culture of respect in the workplace. I, for one, will be interested to see how it plays out in the future.

Good Talk,
Tom

Read Full Post »

Until 2005 the United States imposed rules on common carriers (legacy telecom companies such as AT&T and Verizon) requiring them to sell bandwidth on their networks to other ISPs at discounted prices. The idea was basically that the best way to ensure competition in the ISP space was to make incumbents sell their bandwidth to other ISPs at wholesale prices. This would allow the EarthLinks of the world to compete and, hopefully, create an environment where increased competition leads to lower prices and better service for the consumer. However, in 2005, the telecom companies successfully lobbied the FCC to change the rules.

The carriers argued that the rules unfairly penalized them simply because of the delivery mechanism they used. AT&T and Verizon delivered data services over old telephone wires, and because, as telephone companies, they were designated common carriers, the rules were different for them. Comcast, on the other hand, delivered data services over coaxial cable and was not subject to the FCC’s rule about wholesaling bandwidth. In 2005, the FCC agreed with the legacy carriers and dropped the rule requiring discounted prices. Some say this decision by the FCC provided a rallying point for the net neutrality movement.

Other countries such as South Korea and Japan have largely maintained bandwidth sharing rules similar to the pre-2005 rules in the United States. One can question whether this is the reason some of these countries have much wider broadband adoption, much faster broadband speeds, and much lower prices. Should the FCC have maintained the pre-2005 rules? Did the rule change spark the call for net neutrality? Should Comcast be regulated as a common carrier? All questions worth exploring.

Good Talk,
Tom

Read Full Post »

I’m taking a break from my series on net neutrality to discuss IT policy at the company level (as opposed to the national/international level). Recently my firm was acquired by a much larger company, and over the last week or so we, the acquired employees, have been working through the elements of the transition. While both companies have been excellent in communicating changes and keeping a positive attitude throughout, there have definitely been some “hiccups”.

As consultants, our primary concern is always client service. The merger, to say the least, has been a distraction from serving our clients. We push through, but there is a noticeable impact on our clients as our time and effort is divided between their work and our own transition. This impact has been exacerbated by a slew of technology changes we must handle. We’ve gone from using Microsoft Outlook, a simple, easy to use, and functional email program, to Lotus Notes, a program without a major overhaul in 20 years. Our corporate IT policy does not allow Outlook and has stifled our creative efforts to use it (DAMO and other tools). Additionally, corporate policy mandates the use of Microsoft Internet Explorer and actively scans for installs of other browsers to delete them. Anyone who has used Chrome knows how much faster, easier, and more intuitive it is. Lastly, we have an auto-backup program that backs up our machines on a daily basis. This is a great idea. However, the backup cannot be scheduled person by person; it is randomized to efficiently utilize network resources. This can be a really big hassle when a consultant is using his/her machine to deliver a PowerPoint presentation and suddenly the backup utility kicks on and zaps the CPU/memory of the machine.

Each of these policies was set up with noble intentions. Outlook is not as secure as Lotus Notes, and Notes has phenomenal back-end DB capabilities. IE is the corporate standard for browsers and minimizes compatibility risks. Regular backups are an essential part of any corporate risk mitigation/disaster recovery plan. However, the consequences of these policies have, perhaps, not been fully explored. Moving 600 consultants used to Outlook, Chrome, and on-demand backups onto these new policies has severely impacted our ability to help our clients. We spend more time rebooting, more time fighting with our email, and more time waiting for web pages to load. All this means we have less time (or less sleep) to help our clients. (This doesn’t even include the consultants who were forced to give up their Macs for ThinkPads.)
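On the backup problem specifically, even a simple busy-check before a randomized backup window would avoid the mid-presentation CPU hit. A hypothetical sketch in Python (using the third-party psutil library; the process name, thresholds, and start_backup hook are my own assumptions, not our actual backup tool):

    import random
    import time

    import psutil  # third-party: pip install psutil

    # Hypothetical agent logic: keep the randomized schedule (to spread load
    # on the backup servers) but defer while the machine is visibly busy.

    BUSY_CPU_PERCENT = 50.0
    FOREGROUND_APPS = {"POWERPNT.EXE"}  # assumed PowerPoint process name

    def safe_to_back_up() -> bool:
        """Skip backup while a presentation is running or the CPU is loaded."""
        for proc in psutil.process_iter():
            try:
                if proc.name() in FOREGROUND_APPS:
                    return False
            except psutil.Error:  # process vanished or access was denied
                continue
        return psutil.cpu_percent(interval=1.0) < BUSY_CPU_PERCENT

    def start_backup() -> None:
        print("backup started")  # stand-in for the real backup routine

    def run_backup_window() -> None:
        time.sleep(random.uniform(0, 3600))  # random start within the hour
        while not safe_to_back_up():
            time.sleep(300)  # check again in five minutes
        start_backup()

This still spreads load across the network; it just stops the agent from grabbing the CPU at the worst possible moment.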

When designing IT policy (or integrating a new company with conflicting IT policies), it’s important to keep the ultimate goal in mind. When policy or process impedes progress, a company must seriously re-evaluate. Now, it’s been less than two weeks and I’m sure the learning curve is steep, but this is a good example of the impact IT policy can have on a business and its customers.

Good Talk,
Tom

Read Full Post »
