Bringing IT at the FDA into the 21st Century

This past week, the Alliance had the pleasure of meeting with Dr. Eric Perakslis, the Chief Information Officer (CIO) and Chief Scientist (Informatics) at FDA. Please click here to see the slides he provided, which review some of the functions, challenges, and accomplishments of his office during the brief (less than one year) period that he has been in the post. An updated version of this deck will probably be available in mid-September, after he and Commissioner Hamburg present some of their more recent accomplishments and perspectives at a conference in New England.

Dr. Perakslis’ background is quite different from that of most senior FDA leadership, in that he comes with a proven track record in industry (most recently at J&J) and is a late-stage kidney cancer survivor. He has committed to stay 3 to 5 years and help transform FDA into a more modern scientific decision-making agency ... which starts with the information and data needed to make better decisions. He expects that this will result in greater efficiencies (lowering costs, as well as providing monies for re-investment) and in dramatically greater value to the American public from the services the agency provides and from more informed decision-making.

The points from Dr. Perakslis that I thought were the most interesting:

  • Government costs for relatively routine IT functions (e.g., maintenance of e-mail accounts) can be as much as 100 times greater on a per-account basis than in the private sector. Because of the security and other special requirements of government, the entire difference cannot be realized in savings -- but it is quite achievable to reduce such costs (over time) to 50% or less of their current levels.

  • A significant portion of the IT and bioinformatics changes that will improve FDA performance are not dependent on new or improved technology. Rather, off-the-shelf and modestly modified solutions are already available. The problem of upgrading is far more often a matter of "sociology and not technology." An example was the use of computers during food inspections -- the key having been to identify computer specs that would allow a machine to be hosed down after an egg inspection. The efficiency gains of a computer versus paper documentation during such an inspection are enormous.

  • The monies to pay for IT and bioinformatics come from the Center budgets, plus NCTR and the Office of the Commissioner. This means the "budget" for IT is not found in one place, but comes from every part of FDA (including from user fees, where and when appropriate).

  • He counts about a half-billion dollars spent per year on maintaining, improving, and performing IT and bioinformatics activities. Even though these are "costs" and vulnerable to any cuts, including sequestration, he felt that the cost-savings/value-added proposition was sufficiently great that he would be able to continue to draw the personnel and resources needed for his efforts.

  • One of his structural accomplishments has been to triage requests into pathways most likely to achieve results in a reasonable time and cost. For example, he has staff that concentrate solely on projects that can be accomplished in 30 days. The intent is always to create a priority for initiatives that will have the most impact -- but sometimes a dozen smaller problems solved quickly are as important as a project with a much longer time to resolution.

  • Automation of functions and cloud computing are two of the ways in which technology cycles can be brought down to 18 months or less -- typical for industry, but not the norm for much of government, where technology cycles often run 6 to 8 years and systems are obsolete by the time they are ready.

I was particularly impressed by his answer to the issues surrounding data standards. He pointed out that much of the data needed for hypothesis-generation and general decision-making already exists -- sound approximations can be made without forcing data consistency or standards back onto databases that were collected using different methodologies and priorities. Once that process is started (rather than deferred while waiting for standards to be created or for compatible databases to be reconstructed), it is often much easier to get agreement and cooperation for later data collection to coalesce around a consistent set of standards (as the level of need shifts from "about right is good enough" to "precision is needed for final decisions and approvals").

I hope this provides a good overview of what Dr. Perakslis covered. (I am sure he will tell me if I didn’t.) I have more notes and will come back to this topic sometime this fall -- it was an immensely hopeful meeting, and I want to be sure that this hopefulness is conveyed to the Alliance membership.

Note: This analysis and commentary is written by Steven Grossman, the Deputy Executive Director of the Alliance for a Stronger FDA.
