Tag Archives: healthcare

Interesting observations on the healthcare system implementation

9 Dec

Many thanks to http://www.bespacific.com/ for forwarding this post. I often get funny looks when I tell people that the expected outcome may not be the desired outcome. With one’s analyst hat on, it is easy to say this: one has a hypothesis, and one tests it. If the hypothesis proves false, then we have identified a place not to go, or a refinement in thinking. For an analyst, “failure” (as defined below) is an option. For program managers, it must be an option too, but one that is very hard to manage; generally the inability to address this issue starts at the top and is framed within the culture of the organization.

As a project manager, one would think failure is an option: identified as a “risk” in PMP speak, and addressed and managed as such. We will not know the details of what happened for a while, but the article below sheds some light.

HealthCare.gov and the Gulf Between Planning and Reality
By Irving Wladawsky-Berger
Guest Contributor, WSJ

 It’s way too early to know what really happened with the botched launch of HealthCare.gov. We don’t know how it will all play out in the years to come, or what its impact will be on the evolution of the Affordable Care Act (ACA), on election results over the next few years, or on President Obama’s legacy. Depending on how it all turns out over time, this will be either just a chapter in future books on the history of the ACA and the Obama administration, or the subject of major books and investigative reports.

 Most everyone who’s been involved with the development of complex IT systems knows how wrong things can sometimes go.  So, when serious problems do happen, we are eager to learn the lessons that might help us avoid similar problems in the future. It’s quite possible that HealthCare.gov and the ACA’s overall IT system are such complex outliers–technically, organizationally and politically–that any lessons learned might apply to few other projects.  But, given the increasing complexity of private and public sector IT systems, the lessons are worth thinking about.

 I like the way Clay Shirky, NYU faculty member as well as author and consultant, framed the problem in a very interesting blog post, Healthcare.gov and the Gulf Between Planning and Reality. He writes about the gulf between those charged with planning the overall rollout of the ACA and its health care exchanges and the realities of trying to get such a complex system designed, built and launched in a short amount of time. It is essentially a tale of “failure is not an option” versus the messy world of highly complex IT systems. While the post is focused on the launch of HealthCare.gov, it can also be read as a more general discussion of the kinds of problems often encountered in highly complex IT-based projects, when a management decision to win a deal at all costs comes back to haunt the implementation of the project.

 “For the first couple of weeks after the launch, I assumed any difficulties in the Federal insurance market were caused by unexpected early interest, and that once the initial crush ebbed, all would be well,” he writes.  “The sinking feeling that all would not be well started with this disillusioning paragraph about what had happened when a staff member at the Centers for Medicare & Medicaid Services [(CMS)], the department responsible for Healthcare.gov warned about difficulties with the site back in March.”

 The paragraph responsible for Mr. Shirky’s sinking feeling was part of an October 12 NY Times article, From the Start Signs of Trouble at Health Portal. According to the article, the warnings came from CMS deputy CIO Henry Chao, the chief digital architect for the new online insurance marketplace. In response, his superior told him:

 “. . . in effect, that failure was not an option, according to people who have spoken with him. Nor was rolling out the system in stages or on a smaller scale, as companies like Google typically do so that problems can more easily and quietly be fixed. Former government officials say the White House, which was calling the shots, feared that any backtracking would further embolden Republican critics who were trying to repeal the health care law.”

 “The idea that failure is not an option is a fantasy version of how non-engineers should motivate engineers,” adds Mr. Shirky. “Failure is always an option. Engineers work as hard as they do because they understand the risk of failure.” In his opinion, neither technology, talent, budgets nor the government’s bureaucratic processes are the main culprits here. Rather, this is a management and cultural problem. As a result of the huge political pressures they were under, top administration officials did not feel that they could seriously address the possibility that things might go wrong.

 Other articles paint a similar picture, such as this recent one in the WSJ’s CIO Journal:

 “It was on a cold, sunny day in Baltimore last January that Curt Kwak, chief information officer of the Washington Health Benefit Exchange, first realized that the signature feature of President Obama’s Affordable Care Act could be in trouble.  That day, at a status review meeting of CIOs of state health exchanges, he learned that many of his peers were far behind where they should have been.  According to Mr. Kwak, several of his peers hadn’t yet selected a systems integrator – tech vendors who play crucial roles in fitting together the multiple components of health insurance exchanges that allow consumers to select and enroll in health plans.”

 Why did the administration, as well as several states, wait so long to start planning the ACA system, including the health care exchanges? Ezekiel Emanuel, oncologist, vice provost and professor at the University of Pennsylvania and former White House advisor on health policy, said in a good article on the subject that the administration did not want to release detailed regulations and specifications on the exchanges in the middle of the 2012 election campaign, in order to avoid political controversies. “This may have been a smart political move in the short term, but it left the administration scrambling to get the IT infrastructure together in time, robbing it of an opportunity to adequately consult with independent experts, test the site and fix any problems before it opened to the public.”

 But, then came the reality, which Mr. Shirky describes as the painful tradeoff between features, quality and time.

 “When a project cannot meet all three goals–a situation Healthcare.gov was clearly in by March–something will give.  If you want certain features at a certain level of quality, you’d better be able to move the deadline. If you want overall quality by a certain deadline, you’d better be able to simplify, delay, or drop features.  And if you have a fixed feature list and deadline, quality will suffer. . . You can slip deadlines, reduce features, or, as a last resort, just launch and see what breaks. . . That just happened to this administration’s signature policy goal.”

 The inability of a troubled project to meet all three goals simultaneously almost feels like the complex-systems equivalent of the Heisenberg uncertainty principle, under which it is impossible to determine both the position and the velocity of a particle to arbitrary accuracy, no matter how good your measurement tools are. While this is clearly not a scientific principle but a set of guidelines based on decades of experience, there seem to be intrinsic limits to our ability to fix troubled IT projects no matter how hard we try.

 In The Mythical Man-Month, noted computer scientist and software engineer Fred Brooks introduced one of the most important concepts in complex IT systems: adding manpower to a late software project makes it later. Brooks’ Law, as his concept became known, remains as true today as when it was first formulated almost 40 years ago.
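Brooks’ reasoning can be made concrete with a little arithmetic: he observed that the number of pairwise communication paths in a team of n people grows as n(n-1)/2, so coordination overhead grows much faster than headcount. A minimal sketch (the team sizes are arbitrary illustrations):

```python
def communication_channels(n: int) -> int:
    """Number of pairwise communication paths in a team of n people.

    Brooks noted this grows as n(n-1)/2, which is one reason adding
    people to a late project tends to make it later, not earlier.
    """
    return n * (n - 1) // 2

# Doubling a team of 6 to 12 more than quadruples the coordination paths.
for size in (3, 6, 12):
    print(size, communication_channels(size))
```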

 Over the years, we have learned that there are limits to our ability to pre-plan complex IT projects in advance. You need a good design, architecture and overall project plan, but you also need the flexibility to learn as you go and make trade-offs as appropriate. Most such projects are therefore released in stages, with alpha and beta phases that start testing the system with a select and relatively small number of users. Such early testing uncovers not only software bugs, but also design flaws that users have trouble with.
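The staged releases described above, alpha and beta phases exposed to a small, stable slice of users, are often implemented with deterministic percentage buckets. A minimal sketch of the idea; the user IDs and feature name are hypothetical:

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: int) -> bool:
    """Deterministically place a user in a staged-rollout bucket.

    Hashing the (feature, user) pair gives a stable value in [0, 100),
    so the same users stay enrolled as the rollout percentage is raised.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < percent

# A 5% alpha: only a small, stable subset of users sees the new system.
alpha = [u for u in (f"user{i}" for i in range(1000))
         if in_rollout(u, "new-exchange-ui", 5)]
```

Because bucketing is a pure function of the inputs, raising the percentage only adds users; nobody already in the test group falls out, which keeps early feedback consistent.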

 Another important lesson is that all parties involved in a complex, high-risk project must have a good working relationship. All available information on the status of the project should be shared, so there are few last-minute surprises. Tradeoff decisions and project adjustments should involve all key members of the team. Behind most seriously troubled projects lies not only a gulf between planning and reality, but a lack of the close collaboration and overall good will necessary to make the project succeed.

 It’s hard to imagine a more politically contentious project than the ACA.  The administration was worried that any glitches uncovered while testing the system as part of the usual staged release cycle would give further ammunition to those trying to kill the ACA altogether. They may have felt that slipping deadlines and reducing features prior to the October 1 launch was not politically feasible, and that they therefore had no choice but to launch anyway and hope for the best.  Did they make the right decisions?  We’ll find out in the fullness of time.

 Irving Wladawsky-Berger is a former vice-president of technical strategy and innovation at IBM. He is a strategic advisor to Citigroup and is a regular contributor to CIO Journal.


Healthcare’s New Big Idea

14 Oct


Once upon a time in the American healthcare system, big data was an unknown idea. Recognizing that healthcare costs rose unmanageably and healthcare quality varied dramatically without clear explanation, Congress introduced Managed Care with the hope that relying upon a for-profit business model would make the system more competitive, more comprehensive, and more effective. Now, over thirty years later, it appears that new changes in American healthcare will position “big data” as the driver of effectiveness and competitiveness. Here are a few reasons why.

When thinking about the government policy that will make big data essential in the new healthcare system, three main pieces of legislation come to mind – the obvious heavyweight of the group being the Affordable Care Act (ACA). By now, most know that by passing the ACA into law, the federal government shifts America away from a volume-based system of care (in which doctors and hospitals make money based on how many tests they run and treatments they try) to a value-based system in which doctors and hospitals receive rewards according to the value created for patients. However, few know that in order to actualize this value-based system, the ACA directly implicates big data at the federal and state levels of healthcare. For example, the ACA authorizes the Department of Health and Human Services (HHS) to release decades of stored data and make it usable, searchable, and ultimately analyzable by the health sector as a whole to promote transparency in markets for healthcare and health insurance. Here, the driver of transparency, and thus competitiveness and effectiveness, is clearly big data.

In other examples, the ACA uses language that endorses, if not mandates, big data use throughout the system. The ACA not only explicitly authorizes the Centers for Disease Control and Prevention (CDC) to “provide grants to states and large local health departments” to conduct “evidence-based” interventions, it also creates technical assistance programs to diffuse “evidence-based therapies” throughout the healthcare environment. Note that in the medical community, “evidence-based medicine” means making treatment decisions for individual patients based on the best scientific evidence available, rendering the use of this relatively new term an endorsement of big data in healthcare treatment. These pieces of evidence – in the form of direct references to big data at the federal level, state level, and patient level – strongly support the conclusion that the ACA creates a new system reliant upon big data for efficiency and competitiveness.

The remaining pieces of legislation further signal big data as the new lifeblood of the American healthcare environment. In 2009, the Open Government Directive, in combination with the Health Data Initiative implemented by HHS, called for agencies like the Food and Drug Administration (FDA), Centers for Medicare & Medicaid Services (CMS), and CDC to liberate decades of data. The Health Information Technology for Economic and Clinical Health Act (HITECH), part of the 2009 American Recovery and Reinvestment Act, authorized over $39 billion in incentive payments for providers to use electronic medical records, with the goal of driving adoption up to 85% by 2019. Finally, to facilitate the exchange of information and accelerate the adoption of data reliance in the new health environment, CMS created the Office of Information Products and Data Analytics to oversee numerous databases and collaborate with the private sector. Among other functions, this office will oversee the over $550 million spent by HHS to create data clearinghouses – run by states – that will consolidate data from providers within the given state. All of this legislation, which essentially produces a giant slot for a big data peg to fill, paves the way for a new healthcare environment reliant upon rapid sharing, analysis, and synthesis of large quantities of community and national health data.

Now at this point, nearly four years after legislation supposedly opened the floodgates of big healthcare data to the private sector, the reader must wonder why more private sector companies haven’t taken advantage of an obvious market opportunity. The answer is: actually, a few first movers have.

Blue Cross / Blue Shield of California, working together with a company called Nant Health, has created an integrated technology system that allows hospitals, HMOs, and doctors to deliver evidence-based care to patients under their jurisdiction. This system catalyzes performance improvement, and thus revenue-generating value creation, across the system. The use of big data has also allowed some first movers to innovate and build applications on newly liberated data. A company called Asthmapolis created a GPS-enabled tracker that monitors inhaler usage by asthmatics, directs the data into a central database used to detect macro-level trends, and merges the data with CDC data pertaining to known asthma triggers. These few cases illustrate that private sector engagement with this new market opportunity remains new, diverse, and far from fully mapped out.
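The Asthmapolis pattern, joining device telemetry with a public reference dataset, is at heart a plain data merge. A minimal sketch in which the counties, event counts, and triggers are entirely invented for illustration:

```python
# Hypothetical inhaler-usage events, aggregated by county from trackers.
inhaler_events = [
    {"county": "Cook", "events": 42},
    {"county": "Harris", "events": 17},
    {"county": "Kings", "events": 8},
]

# Hypothetical reference table of known asthma triggers by county.
known_triggers = {
    "Cook": ["ozone", "pollen"],
    "Harris": ["mold"],
}

# Left join: attach trigger context to each usage record, so macro-level
# trends in usage can be read against known environmental factors.
merged = [
    {**event, "triggers": known_triggers.get(event["county"], [])}
    for event in inhaler_events
]
```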

The ACA has moved into its execution phase, and the introduction of the big data idea poses new and interesting challenges to how the American healthcare system will evolve. Some challenges will bring about positive change, such as the identification of clear opportunities for preventive care. Other challenges will bring negative change, such as the adverse effects transparency will likely have on certain patient groups. Regardless, it looks like big data is here to stay.

Good Perspective on Analytics

11 Feb

Interesting post on healthcare fraud, with a couple of good points: the traditional view of predictive analytics assumes you know what you are predicting, but fraud is adaptive; it changes. As a result, being predictive requires a more diverse approach that draws on a range of analytical techniques.
