Thursday, December 20, 2012

Using IP Strategies to Break Down Barriers to Progress

Intellectual property (IP) protection is critical to driving investment in new products that can help patients and create economic value. Yet some maintain that IP protection makes scientific collaboration, technology transfer, and commercialization more challenging. At Partnering for Cures, five leaders in biotechnology, science, and intellectual property came together to discuss how IP strategies can quicken the pace of innovation and get new technologies to the patients who most need them.

Moderator Maria Freire of the Foundation for the NIH began by pointing out that some people “go into panic attacks when having to deal with intellectual property … it seems like a big hurdle between basic science research done in universities or companies and getting it to the market.” However, this is not a new issue. The biggest change is that, within the last 10 years, new players have emerged: venture philanthropies. Freire referenced FasterCures’ newest publication, Unlocking Intellectual Property: Principles for Responsible Negotiation, which serves as a useful tool for all parties in biomedical research, in particular these new nonprofit disease groups and philanthropists.

In addition to the arrival of new players, Stephen Johnson of Kirkland & Ellis LLP and One Mind for Research emphasized that the IP landscape has changed due to the arrival of new technology. In the past, IP focus was on patents, but the “focus has moved away from patents and toward data,” he said. He used to see resistance to sharing data among pharmaceutical and biotechnology companies. Now, “there is acceptance of the pre-competitive space,” and companies increasingly embrace the opportunity to work together. “The future of creative collaborations will be balancing openness with incentives,” he suggested. Freire summarized that companies are more willing to share all data in the beginning, but once there is true innovation, they will then put protection around it. Johnson agreed with her assessment and advocated for processes that promote transparency. One reason companies are reluctant to share data, he said, is the worry that someone else with access to the data might be smarter than they are.

Stephen Friend of Sage Bionetworks gave an example of another hindrance, stating that “many companies don’t feel like they can share what they’re talking about … and it’s hard to get things financed without a clear IP strategy. Many worry that someone else could come in, grab the idea, and take advantage of them.” He supported extending the pre-competitive space: “Too many post-docs think they are working on the next billion-dollar drug long before it is.” He cited the successful collaboration of Merck, Pfizer, and Lilly, which generated data in China and agreed to share the data among themselves for one year and then make them publicly available.

Teresa Stanek Rea said that the U.S. Patent and Trademark Office also has a cooperative approach: “We are trying to collaborate with companies to find out what they need to do their job.” Like industry, the Patent and Trademark Office is trying to be more precise and more efficient in what it does and sees itself as the innovation agency in the U.S. government. “We are an agency in the throes of change, just as you are,” she said. Rea noted that the America Invents Act puts forth “great provisions that help the user community because it takes a second look at issued patents, and whether the patent should have been granted.” Rea believes that the act should minimize litigation and not inhibit research.

Steven Tregay of FORMA Therapeutics brought the conversation back to venture philanthropies and emphasized the importance of being “cognizant of whether the patent that covers the product can actually be translated into treatment.” The key is helping people understand the value for society, as opposed to ownership of one possible combination that may prove pertinent or may never be turned into a drug. He cited the success of the CoMMpass project of the Multiple Myeloma Research Foundation, a collaborative study that brought together a network to decide how IP will be shared and who has access to it. Tregay has also worked with the Leukemia & Lymphoma Society (LLS), which aggregates data from many organizations at once. LLS is concerned with creating value for patients and creating a path forward, and it shouldn’t be burdened with the “nightmare of bringing universities together,” he said. “That is the power of these disease foundations – that they are really laser sharp in their approach,” said Freire. “They don’t want to fix the world; they want to fix something for that indication. The traditional paradigm may not necessarily apply.”

Friend agreed that traditional ways of doing business may stymie dialogue and interaction. “We are at a spectacular time where we have tools, new approaches in order to innovate, and yet the way we have structured our incentives and our rewards around sharing, around who is getting credit, et cetera, is basically independent to that.” He cited the success of CommonMind, a public-private pre-competitive consortium that generates and analyzes large-scale genomic data from human subjects with neuropsychiatric disease and makes the data and results broadly available to the public. Friend said that university tech-transfer offices at first had a hard time agreeing to share the data being generated, but that “the data required to build the models needed to develop the drugs had to be accessible in order to innovate.” The parties created collaboration agreements that allowed investigators’ data to be shared rather than kept to themselves. “We must come up with incentives and rewards that allow the data to get out there,” he said.

Joseph DeSimone of the Frank Hawkins Kenan Institute of Private Enterprise at the University of North Carolina agreed that “partnerships are what work the best, and new connections should always be made,” but cautioned that “without really clear IP, it’s getting increasingly hard to get things financed. Having a really clear IP strategy and path to market is going to be increasingly important.” His university has a conflict-of-interest committee that meets with him and his students who start companies. He believes that transparency makes partnerships more successful: “If you are open to that kind of openness, it can be powerful to enable these kinds of connections.”

In closing, the panelists discussed the future of intellectual property protection with regard to innovation. There was a consensus that patenting has gotten more difficult, and Freire ended by saying, “Let’s not rediscover wheels. If you can put something in a box, it’s a lot easier. If you want to think outside the box, just make sure that what you have already doesn’t fit in an existing box.”

Related resource:
Unlocking Intellectual Property: Principles for Responsible Negotiation

Monday, December 17, 2012

Imagining improved models of technology transfer

There is a growing realization that the traditional model of technology transfer at universities isn't entirely keeping up with the growing complexity and changing landscape of biomedical research and development. Though agreement structures are still evolving and funding sources are changing, a growing appetite for earlier information sharing and partnering has led to new and creative approaches to collaboration.

At Partnering for Cures, five experts in the field of technology transfer and university commercialization discussed new approaches to innovation and collaboration. Moderator Lou DeGennaro of the Leukemia & Lymphoma Society began the discussion by asking what recommendations the experts had for dealing with the “growing pains around commercialization” and the consequent tension universities have been experiencing with tech transfer.

Louis Berneman of Osage University Partners was quick to designate himself as a “critical lover” of tech transfer and argued that the metric for measuring success should not be how many university start-ups are in existence, but rather how many went on to “induce further investment.” Berneman added that licensing revenue should not be the main concern.

Other panelists agreed that tech transfer needed to be viewed as a comprehensive process. Chris Coburn of Cleveland Clinic Innovations noted that “[Tech transfer] is all about execution and talent—companies can’t just be good at negotiating deals, they must be attentive to details.” Jodi Black of the National Heart, Lung, and Blood Institute at the National Institutes of Health added that innovators and those in tech transfer offices needed to work together to attain the necessary intellectual property expertise. She explained that “these ecosystems should be developed in a way that rewards culture change and values commercializing innovations.”

Regis Kelly of the California Institute for Quantitative Biosciences reminded the audience of the urgency of addressing tech transfer challenges when he stated that he was on a mission, inspired mainly by his wife, who has Alzheimer’s. Kelly said that the “rate of getting ideas out of the university and into the marketplace to help people has to be accelerated” and that “tech transfer offices are too focused on faculty” even though the best people to start companies are post-docs, those with “fire in the belly.”

In the opinion of Robert Urban of Johnson & Johnson, tech transfer relies on people, and bringing people together gives technology the opportunity to realize itself. He offered an example from his time at MIT, where he was recruited to help launch an interdisciplinary institute to create a new way to tackle oncology, a field that is “buried alive in data.” Urban asserted that “biologists needed to be put in an area where they could be supported by tech people” so that they are close enough to interact and collaborate. The institute includes biologists, technologists, and mathematicians. In five years, this collection of individuals has created 17 companies, which have raised $300 million in capital.

In closing, the panelists agreed that better project management is a key catalyst to advancing this field. Kelly and Coburn both stated that there was a shortage of information, and Coburn suggested profiling the 75 largest academic medical centers so there would be a central database of information about their medical innovations. Berneman emphasized the importance of focusing not on licensing revenue but on future investment, while Black stated that regulatory and business expertise needed to be “in-house” at research institutions and companies for better decisions to be made. Urban’s advice was to encourage transparency and find personnel able to “adapt to the journey [of tech transfer],” even if that means starting the process over from the beginning. DeGennaro summarized by saying, “It’s not just about how many new grants there are or how much money has been accumulated, but rather a need for project management [of tech transfer] in a way that hasn’t been thought about yet in an academic setting.”

The evolving role of payers in the R&D ecosystem

Payers, including the federal Centers for Medicare and Medicaid Services (CMS), are taking an increasingly visible seat at the decision-making R&D table, long before they get asked to pick up the tab. They're mining their vast databases and partnering with pharmaceutical companies to help them develop personalized medicines faster. In a Partnering for Cures panel discussion moderated by Ceci Connolly of PricewaterhouseCoopers, experts discussed the evolving role of payers in the R&D ecosystem, focusing on meaningful outcomes, the role of patients, reimbursement, the potential for the Patient-Centered Outcomes Research Institute (PCORI), and how to reduce costs.

Panelists began the discussion by providing insight into the term “value.” Shari Ling of CMS pointed out that in order for there to be value in today’s healthcare system, “outcomes must be meaningful.” Kim Popovits of Genomic Health, Inc. agreed and highlighted the importance of turning data into something actionable. Freda Lewis-Hall of Pfizer brought up a theme heard throughout the Partnering for Cures meeting, that “at the end of the day, outcomes for patients are our currency. Fundamentally, it is all about whether you have significantly affected, improved, and impacted the outcome for the patient that is going to receive the treatment that you are developing.”

The conversation then moved toward the role evidence plays in the R&D spectrum, and patient outcomes dominated the discussion. “Perhaps a meaningful outcome would be the patient’s ability to function,” said Ling. “These are outcomes that are important to patients, but we have no data source so we are a little constrained.” She commented that some of the outcomes measured may not be the most meaningful to patients. Ling also noted, “Decisions for coverage are predicated on what’s reasonable and necessary to achieve better health outcomes.” Lewis-Hall saw potential for organizations such as PCORI to create tools that harmonize not only the methodology for data collection and analysis but also a lexicon of value definitions, so that standards can be developed.

Other panelists focused more on the health economics of the payers’ role. Mark Tykocinski of Jefferson Medical College acknowledged the enormous service PCORI is providing with coding, but reminded the audience and speakers that we need to be aware of where the financial cutoffs are and perhaps question if evidence-based care is worth the costs. Tykocinski spoke of the dangers of continuing down the same path, arguing that “we are seeing a hyper-segmentation of disease categories. With different therapeutics applied to them, the unit costs of individual therapeutics will go through the roof.”

On reimbursement, Popovits called for innovative new approaches to rewarding quality and for revamping how disease is treated, with a greater focus on diagnostics. She noted that this year alone, $80 billion will be spent on cancer therapeutics, “a disease we still do not fully understand.” With the average efficacy of a cancer therapeutic at only 25 percent, this constitutes “a $60 billion waste.”
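
The arithmetic behind that figure is simple to check. Here is a back-of-the-envelope sketch using only the round numbers quoted on the panel:

```python
# Back-of-the-envelope check of the waste estimate quoted on the panel.
# Both inputs are the panel's round numbers, not audited figures.
annual_spend = 80e9     # annual spend on cancer therapeutics, in dollars
avg_efficacy = 0.25     # average share of patients who respond to a therapy

estimated_waste = annual_spend * (1 - avg_efficacy)
print(f"Spend on patients who do not respond: ${estimated_waste / 1e9:.0f} billion")
# -> Spend on patients who do not respond: $60 billion
```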

In closing, Connolly asked each panelist to identify one priority for moving forward. Russell Teagarden, formerly of Medco Health Solutions, Inc., said that there needs to be a “concentrated effort on social learning so we learn [that] … we all have a part in healthcare.” Popovits cited value-based reimbursement and incentives to get there. Ling wished for more effective communication across parties in the R&D ecosystem. Lewis-Hall wanted to see active participation of the payer community in identifying unmet needs and, ultimately, in implementing new diagnostic devices, therapies, and treatment paradigms. Tykocinski suggested that the role of physicians will be totally different in the next few years, and that it is time to put art back into medicine.

This panel aptly portrayed the complex nature of the payer world. Though panelists disagreed on specific solutions and key points about coverage, they all agreed on the importance of developing standards for value decisions and on the need for collaboration among payers, agencies, and companies during the drug development stages. In today's R&D ecosystem, payers have the drive to change the current structure of payment and reimbursement, and they need support on how best to take action.

Advancing personalized medicine through molecular diagnostics

With the mapping of the human genome and the revolution in molecular biology technologies, we are increasingly capable of monitoring human biology and disease in very sophisticated ways. Our ability to measure biological processes and indicators is essentially "turning on the lights" on our understanding of how life works. But tests using molecular methods, including molecular diagnostics, are the rate-limiting step in realizing the full promise of personalized medicine. Challenges include the development of viable business models and the lack of reimbursement policies that recognize the value of companion diagnostics. In addition, laboratory-developed tests are being used extensively, but not always with proper validation, and the Food and Drug Administration (FDA) is struggling to provide regulatory guidelines that address this issue without stifling innovation.

A panel of diverse experts, moderated by Wendy Selig of the Melanoma Research Alliance, came together at Partnering for Cures to discuss the realities and opportunities in the world of molecular diagnostics. Several themes emerged during the discussion, including:
  • Regulatory gaps – for example, laboratory developed tests – are creating uncertainty, particularly when it comes to reimbursement.
  • Managing next-generation data in a productive way will be key to optimizing the potential of molecular diagnostics.
  • Payers are fundamentally optimistic about personalized medicine but cynical because of experience.
  • Though we have a lot of information, we don’t always know what it means.

David Parkinson of New Enterprise Associates, who spoke from the perspective of a clinician turned drug developer turned investor, pointed out that there are major disconnects in the world of molecular diagnostics – from regulatory expectations to payment incentives – that impede our ability to take advantage of the technology. “Until these disconnects are brought into equipoise, the question remains: who is going to develop these highly predictive tests, these biological characterizations of patients that allow clinicians to make personalized treatment decisions based on an individual patient’s tumor?”

Michael Pellini of Foundation Medicine underscored this point by highlighting the practical challenges that need to be addressed. “If we continue to think about each molecular test for a targeted therapy as a single test in which the clinician needs to be brilliant enough to single out the markers that he or she will test for in that patient and be right, that is challenge number one,” he explained. Pellini also pointed to other practical challenges, including the question of whether there will be enough tissue from a biopsy to conduct each test and how push-back from payers on reimbursement could impact use.

Jeffrey Trent of The Translational Genomics Research Institute (TGen) added that although technology has led us to a place where we can make informed treatment decisions based on molecular genetics, it is important that we do not overlook patients who do not have genetic alterations that can be specifically targeted by drugs. This is a challenge that TGen is addressing by conducting clinical trials in patient populations who cannot be stratified and treated based on genetic lesions.

Alberto Gutierrez of the FDA discussed some of the challenges with respect to the regulatory environment. He explained that the regulatory environment in this area is not clear and that much of the molecular testing that goes into diagnostics is done through laboratory-developed tests, which are not regulated by the FDA. Despite this gap, Gutierrez said that the FDA is more interested in clinical validity than clinical utility when it comes to molecular diagnostics. He explained the importance of ensuring that these tests provide physicians with meaningful data that they can understand and use confidently when making treatment decisions.

Sean Tunis of the Center for Medical Technology Policy provided the payer perspective. “Payers are fundamentally optimistic about personalized medicine, but cynical by experience because they know that they aren’t going to get exactly what they were promised in terms of savings,” he said. Tunis explained that payers' chief concern is that stratification by genetic subtype shrinks the patient population for each test, and that low demand for each individual test will drive costs higher. He also said that payers worry that even when a test indicates low risk, clinicians may decide to continue with a more aggressive treatment path; in those cases, payers reimburse not only the test but also a therapy that may not be necessary for the patient.

In conclusion, Selig asked the panelists what they think is the most important change to pursue right away. Everyone agreed that, first and foremost, we must not become overly engulfed in exploratory technology, and must instead continue to make sure that we can treat the patients who need treatment today. Parkinson and Tunis highlighted that important changes need to be made with respect to reimbursement for diagnostic tests. As Tunis put it, “Whenever we get the regulatory framework figured out, we then need to tie it to reimbursement.”

Friday, December 14, 2012

Basic Science: The 98% We Still Don’t Know

There is a growing sense, not only in academia but also in industry, that in many therapeutic areas we simply don't know enough about the basic biology of disease to effectively pursue treatments. Drug development in areas such as HIV/AIDS and Alzheimer's has been brought up short by a sense that we may be shooting in the dark at unclear targets – that we're wasting ammunition, so to speak. At a Partnering for Cures panel, we took a step back to look at the fundamental building blocks of our R&D enterprise to see what questions remain unanswered and why.

Moderator Cecilia Arradaza of FasterCures opened the discussion by noting that too many potential breakthroughs make it far along in the development process but never see the light of day. Turning to panelists representing key sectors of the medical research enterprise, she focused the discussion on identifying tools, technologies, or approaches that will allow us to get to some of these vital answers.

“We know much less than we really need to know about almost every single disease, from rare diseases to very common diseases, because we don’t know enough about what causes diseases and also about heterogeneity of expression,” explained William Chin of Harvard Medical School. Chin reinforced the need for a systems approach to understanding disease. Thomas Insel of the National Institute of Mental Health agreed, and pointed out that “there are lots of reasons why studies fail… there is often unpublished data that could lead others to know that what they are doing is a dead end.” Insel said that forums such as http://clinicaltrials.gov were ways of disseminating valuable information, but that competition may inhibit some scientists from publishing critical data.

Matthias von Herrath of the Type 1 Diabetes Research and Development Center at Novo Nordisk elaborated on the issue: “It is important to work together and break down silos and sequestered areas. Where we fall short is in understanding negative data. This is a fundamental problem in both academia and industry.” Von Herrath further emphasized the value of sharing failures and realizing that difficulties arise when there are only incentives for successes.

Brian Mansfield of the Foundation Fighting Blindness described his perspective on animal models, which are a critical link in the translation of basic science to clinical practice, but are not predictive for all diseases. He cited the example of mouse models, which are easy to breed and cost-effective, but can be very different from humans. Mansfield also said “there are lots of constraints on gene therapy.” For example, many people think they can get gene therapy once the gene that is causing their illness has been identified, but that is not always the case.

Panelists agreed that an overwhelming list of questions remains unanswered. Insel noted that we may actually know only about 2 percent of what we should – an optimistic view, according to other panelists. To improve on this, they said, a culture change is necessary. Mansfield argued that the basic science culture needs to change in ways that definitively help patients, and suggested that grant-awarding organizations make it mandatory to publish both positive and negative data as a condition of accepting a grant. Both Insel and Chin agreed that creating teams of individuals with several different perspectives would help advance the field. Insel added that collaboration is key, but that ultimately most discoveries are driven by one individual investigator taking the lead. Von Herrath pushed for “tangible incentives” and a cooperative culture that can accelerate translation of basic knowledge into effective therapies.

In all, the panel noted the importance of striking a balance between investing in basic science that allows us to understand the biology of disease and creating an environment that allows the serendipitous paths that lead to new therapeutics.

Thursday, December 13, 2012

Catalyzing drug development for the team sport of translational science

Many efforts are under way today to create the drug development tools the field needs – from therapeutic area data standards to preclinical safety biomarkers to patient-reported outcomes instruments. These efforts involve the pre-competitive sharing of data and expertise and lead to standards that are then qualified by the Food and Drug Administration (FDA) and other regulatory bodies.

At this year’s Partnering for Cures meeting, a panel of experts discussed the role of rules, tools, and data pools in the team sport that is translational science. The panel agreed that the efficiency of the drug development process needs to be improved, and the key to doing this will be to reform the system so that the rules are the same for all players involved.

Carolyn Compton discussed the role of the Critical Path Institute (C-Path) in this reformative process. Compton explained that one of the primary goals of C-Path is to form consortia around standards creation to streamline the rules so that they are applicable to all sponsors submitting new drug applications to the FDA. The development of these standards, she noted, will increase the workflow efficiency of both the sponsors and the FDA.

Eric Perakslis of FDA reiterated the need for this type of streamlining. Perakslis explained that currently the system by which submitted applications are checked for completion can take months. In addition, the task of reviewing data that have been collected and measured in numerous ways significantly hinders the workflow. He argued that the creation of thoughtful and meaningful data standards would go a long way in streamlining the review process.

George Vradenburg of USAgainstAlzheimer's pointed out that even modestly compressing drug pipelines would yield billions of dollars in savings for patients, taxpayers, and industry; however, he said that we spend too much time discussing the “trivial underbrush” and that we really need to set priorities that will create real value and “aim at those like a laser light.” Vradenburg also highlighted that the path to alleviating some of the process congestion will require a “focus on some big implementation steps with clear action goals that can be taken on jointly by the team,” which includes government agencies, industry, research communities, and patient advocacy organizations.

Some of the major implementation changes that Vradenburg referred to are evident in precompetitive research initiatives among leading biopharmaceutical companies. Marc Bonnefoi of Sanofi US explained that TransCelerate BioPharma and Project Data Sphere, initiated by the CEO Roundtable on Cancer’s Life Sciences Consortium, are examples of successful precompetitive initiatives in which pharmaceutical companies share data with the goal of using the data more efficiently to improve the quality of clinical studies and accelerate drug development. Bonnefoi stressed, however, that the success of these precompetitive initiatives depends heavily on commitment from each organization’s leadership team to support their aims.

Dana Ball of T1D Exchange redirected the discussion toward asking the right questions of the data that we have: “Data for the sake of having data is not helpful. We have to think … down the line to ask what problems we are trying to solve with this information; all of this will [determine] the tools that we will need to build solutions to these problems and the rules [of using] the data pools.”

Panelists agreed that the primary limitation now is not the data but rather the handling and interpretation of the data. To address this problem, there needs to be a major investment in infrastructure. As Compton said, “[This type of investment] is not sexy, but it is absolutely necessary.” We need this infrastructure to make the data powerful enough to turn into medical solutions for patients.

Tuesday, December 11, 2012

The election’s impact on medical research

The Partnering for Cures panel “Election 2012: What Does it Mean for Medical Research?” offered an inside look at the implications of the November election outcomes for biomedical research and innovation. In an animated discussion, panelists covered some of the hottest issues, ranging from funding for the Food and Drug Administration (FDA) and National Institutes of Health (NIH) to sequestration to the importance of maintaining U.S. competitiveness in healthcare innovation.

Neera Tanden of the Center for American Progress emphasized the need for Americans to maintain their competitive, innovative, and economic edge in the healthcare sector. In recognizing that “the president is extremely mindful of NIH’s role and its role in America’s competitiveness,” Tanden identified not only the need to solve the fiscal crisis, which has created a “high level of uncertainty that is unhelpful,” but also the strong connection between the budget and long-term American competitiveness.

Cheryl Jaeger of House Majority Leader Eric Cantor’s office drew an optimistic picture of possible bipartisan actions that could take place with the new Congress that takes office in January. Jaeger argued that while opinions may differ on entitlement spending, “there really is bipartisan agreement on the importance of funding NIH and FDA. If there is one area where both sides of the aisle come together, it is on medical research.” The challenge lies not in realizing the crucial necessity of these organizations for medical research, but in understanding how “we can grow the economy and how to make sure that individuals recognize that these programs are the highest priority within the funding infrastructure.”

Wendell Primus of House Minority Leader Nancy Pelosi’s office focused largely on the government’s need to raise more revenue in general and to take care of the large wave of baby boomers who are set to retire in the next decade. “We are not going to take Medicaid and Medicare spending down enough to increase NIH budgets – it ain’t going to happen. If we don’t raise revenue, the NIH budget will continue to go down in real terms,” he argued. For Primus, Obama’s re-election has meant the chance to raise government revenue and bring the NIH budget back to a place where it is not losing value due to stagnant funding levels and a loss of purchasing power over time.

Scott Gottlieb of the American Enterprise Institute pointed out that company formation has been limited, VCs are consolidating, and funds for startups are beginning to dry up. Gottlieb argued that the NIH is only part of the ecosystem when it comes to innovation, and there is a need to “look at what’s happening on the policy front and look at the other components: the capital formation, what’s happening with reimbursement, immigration policy, [as] these are exceedingly important to biomedical innovation.”

Moderator Greg Simon of Poliwogg posed a final question to the panelists about sequestration and whether a substantial deal would be reached before the “fiscal cliff” deadline. The verdict was largely optimistic: Jaeger and Tanden were “hopeful” a deal would be struck, Primus stated that a solution will be reached “because we have to,” and Gottlieb suggested that most of the major issues will likely be punted to the next Congress.

Friday, December 7, 2012

Improving patient outcomes through technology

We hear a lot these days about how data sharing and collaboration have great potential to reduce the cost of healthcare and improve outcomes for patients – but who is actually doing it? What is at stake, what are the barriers, and what are potentially scalable solutions?

A Partnering for Cures panel focused on a case example of a patient-centric demonstration project within the ImproveCareNow Network, which includes 44 pediatric gastrointestinal care centers and more than 13,000 patients, and has succeeded in improving remission rates by sharing best practices among care teams across sites. Moderator Dominique Pahud of the Ewing Marion Kauffman Foundation introduced the goals, design, and participants leading the technology intervention.

The project features a collaboration among entrepreneurs, patients, physicians, designers, and researchers to provide a backbone for pediatric gastrointestinal care centers that fully integrates technology-based solutions, including electronic health records, passive monitoring, and patient-reported outcomes. The collaboration is employing a suite of technology interventions at eight different centers within the ImproveCareNow Network, representing 2,400 patients.

Each panelist discussed his or her contributions to the collaboration, with Richard Colletti of the University of Vermont School of Medicine introducing the operations and goals. Colletti highlighted that the remission rate of pediatric gastrointestinal patients has increased from 50 percent to 75 percent since the establishment of the network in 2007, noting that “if we had done this with a new drug, it would be a drug that everyone would want to use.”

Peter Margolis of the University of Cincinnati School of Medicine discussed the founding of the network and the establishment of the revenue model, which relies on care centers paying to participate. He also highlighted the collaborative learning system of the network, which is based on a “steal shamelessly, share seamlessly” ethos.

Designers who were involved in creating the technology backbone also spoke on the panel, including John Chaffins of Lybba. Chaffins described the method Lybba used to design apps, which was built around avatars that solidify the concept of a “patient.” The avatar, he said, “became a tool for thinking about the kind of design choices you are making … and establishing a common language” in the collaboration.

Anmol Madan of Ginger.io addressed his biggest challenges in being a health-focused entrepreneur, such as getting access to healthcare providers in order to understand their technology needs. He also commented that “the biggest value [of participating in the collaboration] was getting the validation we need to go from an interesting technology to a commercial product.”

Finally, John Wilbanks of the Ewing Marion Kauffman Foundation returned the conversation to the 10,000-foot level to discuss the challenges and opportunities for collaboration among multiple players in the biomedical space. As the developer of the intellectual property framework for the collaboration, he spoke of the need for an organizational structure that makes collaboration possible, including a “framework that allows for the creation of both private and public value.”

The panelists highlighted a range of organizational incentives for joining the project, and invited the audience to join the movement toward open, value-adding collaboration.

Solving research problems through social media

As more scientists take their work to the Web and the cloud – publishing papers in open-access journals, recruiting trial participants through Facebook, and crowd-sourcing investigations – the availability and traction of social media tools to support medical research have grown. Facilitating everything from the establishment of "virtual lab space" to the crowd-funding of promising science and scientists, these networks are gaining traction in an increasingly collaborative research environment.

A Partnering for Cures panel featured leaders from diverse organizations that are each using social media in unique ways to create new data platforms and solve research problems. In discussing the impact of social media on research norms, they agreed that social media represents a powerful new source of data, but in order to properly harness the data, the research community needs to become more comfortable collecting and probing the data from perspectives that may be unfamiliar.

Jon Fredrickson of InnoCentive pointed out that the community needs to develop an appreciation for “different angles of attack” to solve problems. Fredrickson explained that in the world of crowd-sourcing investigations, a major hurdle for organizations when a member of the “crowd” solves the problem is actually accepting the solution: “The personal issue that the [organization] has to face is when I get the answer and that person doesn’t look like me, wasn’t trained like me, and doesn’t have gravitas that I have, can I accept the fact that that answer really does solve my problem.”

Moderator Matthew Herper of Forbes Magazine pointed out that while social media can generate a lot of data, the quality of the data can sometimes be called into question.

Sally Okun of PatientsLikeMe explained that there is a story behind each piece of data, whether it is structured or unstructured. PatientsLikeMe uses a clinical networking platform to more accurately assess and report patient outcomes. Much of the data collected are unstructured, as the system allows patients to relay their experiences in their own words. The data are then curated and coded into clinical terms, and validated by sharing the clinical interpretation with patients to ensure accuracy. Okun explained that they have found this type of assessment more accurate than the conventional method of clinically assessing patient-reported outcomes through questionnaires.
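
In outline, that curate-code-validate loop follows a common pattern. Here is a minimal, hypothetical sketch of it in Python; the phrase dictionary, term codes, and confirmation workflow are invented for illustration and are not PatientsLikeMe's actual system:

```python
# Hypothetical sketch of a curate-code-validate loop for patient free text.
# The phrase dictionary and codes below are placeholders, not a real vocabulary.
CLINICAL_TERMS = {
    "can't sleep": ("insomnia", "TERM-0001"),
    "pins and needles": ("paresthesia", "TERM-0002"),
}

def curate(patient_text):
    """Map a patient's own words to coded clinical terms, and queue each
    match for the patient to confirm before the data count as validated."""
    matches = []
    for phrase, (term, code) in CLINICAL_TERMS.items():
        if phrase in patient_text.lower():
            matches.append({
                "term": term,
                "code": code,
                "source_text": patient_text,
                "status": "awaiting_patient_confirmation",
            })
    return matches

report = "Most nights I can't sleep, and I get pins and needles in my feet."
for match in curate(report):
    print(match)
```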

Elizabeth Iorns explained how her organization, Science Exchange, creates an online marketplace for scientific facilities and services provided by universities and companies around the world, which otherwise may be difficult to find or access. Using a fee-for-service model to create an online exchange for science experiments, Science Exchange provides a new way to ensure research reproducibility and validation of findings. Iorns noted that nearly 70 percent of data produced by academic labs cannot be reproduced independently, which may be a key reason that many drugs fail. One reason this problem exists, according to Iorns, is that the incentive system is geared toward quantity, not quality: “There are not stringent checks and balances in place to ensure that the data have been validated and are identified as correct.” Iorns explained that the quality of data generated from Science Exchange is generally higher, not only because of the validation process, but also because single-investigator bias is removed and an overall auditing process is created. “These audited results are more likely to be the ones that lead to a cure,” she stated.

All panelists agreed that the norms as they relate to social media and research are indeed changing; however, the rate of change is not evenly distributed. John Wilbanks of the Kauffman Foundation said, “The rate at which norms evolve compared to the rate at which technology develops, which is so much faster, means that we are going to have a weird 5-10 years ahead of us, and we don’t have a culture right now that is tolerant of weird interstitial periods where we have capacity and not norms.” Wilbanks explained that the immediate danger to be most aware of is knee-jerk regulation and legislation in response to these changes. He warned that if we do not plan strategically, it will be very difficult to survive this period and really harness the power of social media data.

Greg Simon of Poliwogg added to the discussion by talking about the power of the crowd in funding high-risk, high-reward research, and the tools that allow for this type of distributed funding model. In other industries, he explained, it is becoming increasingly clear that there is value in the thoughts and actions of all people, not just private investors, and entire industries are driven by the assessment of those thoughts and actions. The financial community, however, has been slower to catch on to the power of creating and evaluating social platforms to drive new innovation.

In March 2012, Congress passed the U.S. JOBS Act, which allows non-accredited investors access to private equity investments (crowdfunding). Simon explained, “If you believe there is wisdom in the crowd when it comes to finding and shaping new health solutions, shouldn’t we give [the crowd] the opportunity to help shape the future by picking the winner [financially]?”

In closing, the panelists discussed some ideas for incorporating checks and balances into the use of social media across research channels, agreeing that rating structures will need to be put into place to allow the research community to develop confidence in the platforms themselves, as well as the data they generate.

Thursday, December 6, 2012

Disruptive innovation in the biopharma industry

“Disruptive innovation” is a concept made popular by the business community that is starting to catch on in the biopharmaceutical industry. During the Partnering for Cures panel “Beyond business as usual: Disrupting the biopharma business model,” a distinguished group of panelists discussed both the problems with the current biopharma business model and possible disruptive solutions that would make the system more effective.

Moderator Gautam Jaggi of Ernst & Young’s Global Life Sciences Center started off the conversation by stating that as big pharma experiences ever-increasing development times and costs, “The time has literally never been better to make changes to the R&D system through disruption.”

Tomasz Sablinski of Transparency Life Sciences and Celtic Therapeutics pointed out that it is not the people involved in R&D that are causing the problems, such as high failure rates and increasing costs, but something inherent in the playing field: “These people are smart as individuals, but there is something about the system that causes smart people to create poor results.”

All panelists agreed that biopharma companies need to figure out more quickly which targets will not work (also known as “fast fail”). As Stephen Marc Paul of Weill Cornell Medical College noted, “Last stage attrition is killing these companies. We need to figure out earlier in the process which drugs will not work so that we can create a pipeline of late-stage molecules with a higher probability of success.” 

The sharing of ideas and data was another theme that kept surfacing in the discussion. Representing patient advocacy groups, Kathy Giusti of the Multiple Myeloma Research Foundation stated that “Patients are investors too, and they can put pressure on academic centers to make their data public.” According to Giusti, this is just one example of how patient groups can be a disruptive force when their passion and sense of urgency are given a seat at the table.

Bernard Munos of InnoThink noted that sometimes “The translational challenges might be too big for one company alone; the science is just too tough. Companies can wear themselves out. Companies should try to work with their competitors and join forces.”

Ben Shapiro of PureTech Ventures believes that real disruption will come when big industry realizes that “Pharma’s role should be to invest in the best ideas out there” and that instead of doing their own internal research, pharma companies should be focused on stepping in where venture capital has left a hole and finding the best idea from the outside.

If biopharma companies would take a more nimble and innovative approach to finding and sharing the best ideas, both from inside and outside their company walls, we would see progress accelerate, panelists agreed.

Applying innovations to life sciences investment

During a recent panel at Partnering for Cures, experts discussed the financing gap in life sciences and spotlighted varying perspectives on the “VC retreat” and the need to apply innovative financial and operational models to finding and funding science that holds the promise of helping patients. Renowned investment leaders deliberated the challenges in the life sciences ecosystem that prevent it from becoming a better investment sector. Panelists candidly shared varying points of view – from finding the incentive to kill bad experiments to the pros and cons of collecting big data.

Jens Eckstein of SR One, GlaxoSmithKline’s independent healthcare venture capital organization, noted that “syndication is the biggest problem we are facing in the early stages.” Eckstein argued that a major issue in bioscience today is that “things go to the clinic that should never go to the clinic,” with little incentive, in both academia and pharma, to kill inefficient and bad projects. The result, he said, is an inefficient system – with little change in clinical development attrition over the past 20 years – that has created a more difficult investment climate for everyone. Eckstein also argued that assembling adequate sample sizes and finding individuals to curate data in a useful manner remain tasks too big to tackle.

Alastair J.J. Wood of Symphony Capital echoed Eckstein’s comments, stating, “I think the current [development] model is fundamentally broken.” Wood hailed a new approach to drug development that mirrors the distributed partnering model mentioned in FasterCures’ Fixes in Financing report. Wood believes that the status quo of picking winners, as pharma tends to do, is fundamentally flawed, and argued that without changing the system to spread risk adequately, there is no means of attracting new capital. On data sharing, Wood was more optimistic than Eckstein, calling for pre-clinical data to be made available and citing its usefulness in determining the effectiveness of compounds and the viability of new drugs on the market.

Likewise, Garen Staglin of One Mind for Research and Dan Hartman of the Bill and Melinda Gates Foundation also saw huge potential in data sharing and stressed the importance of public-private partnerships and the need to bring individuals together. “By bringing people together, we have a shorter distance to go,” noted Hartman. Staglin stated that the key for developing cures can be found in bringing different sectors together and creating stronger collaboration. “Incidence rate of illnesses do not know geographic boundaries nor does the science that will help us unlock it,” he said. Hartman agreed, noting that “with access to data, we decide where to put money for products that actually help people.”

Moderator Chris Varma of Blueprint Medicine asked a poignant question: If the product development cycle is shorter in other fields, such as IT, and prediction methods to better estimate outcomes are so much better in other fields, why do we still invest in life sciences? Eckstein gave a response that summed up the feelings of the panelists: “If things work, we can make a tremendous difference.” With the right collaboration, resources, investment, and the right optimism, science has the capability to achieve amazing things.

Wednesday, December 5, 2012

Increasing the efficiency of clinical trials

It's impossible to talk about "faster cures" without talking about significantly reducing the time and cost of the clinical trials process. The challenges are well known, and many solutions have been proposed, but transformation has been elusive.

A panel of diverse experts discussed current innovations that could feed systems change, and agreed that universally interconnected electronic medical records will greatly increase the efficiency of many aspects of the clinical trial process.

Clinical trials are a critical component of the development of innovative therapies; however, trial durations have increased by nearly 25 percent over the past three decades, and costs have grown at a compound annual rate of roughly 12 percent. Moderator Melissa Stevens of FasterCures began the discussion by asking the panelists their thoughts on how to introduce business process improvements into the clinical trials process.
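
To see what a 12 percent compound annual growth rate implies, a quick back-of-the-envelope calculation (using only the figures quoted above) shows how sharply costs compound over three decades:

```python
# What a 12% compound annual growth rate implies over three decades,
# using the figures quoted above.
cagr = 0.12
years = 30
multiplier = (1 + cagr) ** years
print(f"Cost multiplier after {years} years: {multiplier:.1f}x")
# -> Cost multiplier after 30 years: 30.0x (roughly a 30-fold increase)
```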

According to Robert Califf of the Duke Translational Medicine Institute, a key incremental change that can improve the overall system is the development of FDA guidance to curb wasteful monitoring. Something as simple as efficiently handling and reducing the number of reported adverse events can save time and money by preventing investigators from becoming inundated with data that carry no significant meaning. “Every trial should be designed to answer the [clinical] question, and we need to throw out the excess baggage that comes along with it,” he said. Califf also noted that the development of the www.clinicaltrials.gov database has been extremely useful in that it has allowed more transparency into the number of clinical trials that are poorly designed and not asking clinically meaningful questions. This transparency will help trim the fat and improve the overall efficiency and allocation of resources.

Lawrence Lesko of the Center for Pharmacometrics and Systems Pharmacology at the University of Florida pointed out that figuring out how to access and leverage large datasets – “big data” – will also be critical to driving down costs and improving clinical trial outcomes. Lesko proposed that conducting metadata analysis across large data sets, such as those housed at the FDA, would enable the development of disease progression and dose-response models that could then correlate with early predictors of success or failure of investigational drugs. Lesko stated, “We need more predictors to design clinical trials, especially in therapeutic areas where no one size fits all … the FDA has the expertise to do this [metadata analysis] but not the time, therefore we need to farm that task out to collaborators and consortia that don’t have vested interest in the approval process.”
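
As a concrete illustration of the kind of dose-response modeling Lesko describes, here is a minimal sketch that fits a standard Emax curve to synthetic data; the Emax form is a pharmacometrics staple, but the numbers and code are illustrative only, not an FDA method:

```python
# Minimal sketch: fit a standard Emax dose-response model to synthetic data.
# The data and parameters are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def emax_model(dose, e0, emax, ed50):
    """Emax curve: baseline effect plus a saturating dose response,
    where ed50 is the dose producing half the maximal effect."""
    return e0 + emax * dose / (ed50 + dose)

doses = np.array([0, 5, 10, 25, 50, 100, 200], dtype=float)
rng = np.random.default_rng(seed=0)
observed = emax_model(doses, 2.0, 18.0, 30.0) + rng.normal(0, 0.8, doses.size)

params, _ = curve_fit(emax_model, doses, observed, p0=[1.0, 10.0, 20.0])
e0, emax, ed50 = params
print(f"Fitted E0 = {e0:.1f}, Emax = {emax:.1f}, ED50 = {ed50:.1f}")
# Parameters fitted on pooled historical data could serve as early
# predictors when designing or triaging new trials.
```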

Clifford Hudis of the Memorial Sloan-Kettering Cancer Center presented the “big data” issue from a different perspective. Hudis pointed out that a plethora of patient data exists, albeit in silos of paper files or unconnected electronic filing systems. This problem of disconnected data has placed the research community at a huge disadvantage, particularly when it comes to clinical trials. If patient electronic medical records were housed in an interoperable network of databases, researchers could use these data to design smarter clinical trials that address more clinically meaningful questions, enroll patients more rapidly, and quickly identify safety signals. This would lead to what is commonly referred to as a rapid learning healthcare system.
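
One way to picture the payoff of such interoperability is eligibility screening across sites. The sketch below is hypothetical – the record schema and criteria are invented for illustration – but it shows how pooled, structured records would make finding trial candidates a simple query:

```python
# Hypothetical sketch of eligibility screening over pooled EMR data.
# The record schema and criteria are invented for illustration.
from dataclasses import dataclass

@dataclass
class PatientRecord:
    site: str
    age: int
    diagnosis_code: str       # e.g., an ICD-10 code
    on_anticoagulants: bool

def find_eligible(records, diagnosis_code, min_age=18, max_age=75):
    """Screen records pooled from many sites against simple trial criteria."""
    return [r for r in records
            if r.diagnosis_code == diagnosis_code
            and min_age <= r.age <= max_age
            and not r.on_anticoagulants]

pooled = [
    PatientRecord("site-A", 54, "C50.9", False),
    PatientRecord("site-B", 81, "C50.9", False),   # excluded: over max age
    PatientRecord("site-C", 47, "C50.9", True),    # excluded: exclusion drug
]
print(find_eligible(pooled, "C50.9"))  # only the site-A patient qualifies
```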

Susan Solomon of the New York Stem Cell Foundation highlighted the importance of embedding patients in the process of designing better clinical trials. Solomon shared her difficulty in retaining patient participation in the organization’s stem cell research. She explained that patients were excited and wanted to donate stem cells; however, their enthusiasm was overshadowed by “little things like making sure that their calls were returned quickly or that [patients] did not have to wait too long at appointments at the doctor’s office.” Solomon added that this customer service problem could have been avoided if patients had been embedded in the process during the planning stages.

Gary Neil of Apple Tree Partners punctuated this point by stating, “As we step back and look at this ecosystem, we really have to remember that the focus is the patients … the patients are the real heroes of the entire ecosystem, because they are willing to participate in these trials and take the risk … so we need to make certain that their efforts amount to something.”

The panel agreed that to effectively integrate patients into the process and ensure the success of these trials, the clinical community needs to invest in creating an infrastructure that increases the number of eligible patients who are enrolled in these trials. In addition, patient participation should be simple, and patients should be involved in formulating key clinical questions.

In closing, the panelists agreed that the factor that will have the most impact on the cost, duration, and intelligent design of clinical trials will be an investment in developing an infrastructure of universally interconnected electronic medical records to create a rapid learning healthcare system.