Single Molecule

The next leap in Personalized Medicine will be advances in sensor technology like Cardea

Cardea Chip with 15 graphene field-effect biosensor pads highlighted in green

Can a graphene biosensor company like Cardea Biosciences play a role in making P4 medicine (predictive, preventive, personalized and participatory medicine) a reality?

Lee Hood’s vision for P4 medicine

About six years ago I attended a Personalized Healthcare conference at Virginia Commonwealth University in Richmond, Virginia, where Leroy Hood of the Institute for Systems Biology (Seattle, WA) gave the keynote talk. In it he spoke of what he called “P4 Medicine”: predictive, preventive, personalized and participatory. He laid out a bold vision of what the world would look like 10 years hence: each of us as individuals would be ‘surrounded by a virtual cloud of billions of datapoints’ — our genome, our transcriptome, our phenome, our proteome, our epigenome, our tele-health, our social media, our iPS cells…

And where are we six years later? You could argue the original 10-year timeframe was off by three- to five-fold: while over a million individuals have had whole-genome genotyping performed through Ancestry.com and 23andMe, and have had some risk information relayed back to them, that is a far cry from ‘predictive and preventive’. And while the NIH’s All of Us Precision Health Initiative is working hard to enroll volunteers for its million-member longitudinal cohort, the world of participatory medicine is still in the ‘we’re just getting started’ phase, to say nothing of billions of datapoints being returned to us in a useful and meaningful fashion.

Personalized medicine currently is centralized medicine

Using genomic analysis to guide cancer therapy is an important clinical use of next-generation sequencing. (Disclosure: I work for Pillar Biosciences, which uses multiplex overlapping PCR for clinical NGS target enrichment.) Nonetheless, broad-based genomic sequencing for a common cancer like NSCLC (non-small cell lung carcinoma) benefits only about 15% of the cancer patients genomically tested at a large academic medical center. For further background on this topic, take a look at this Pillar Post article entitled “The value proposition of precision oncology requires careful education”.

Yet in its current implementation, a blood sample or tumor biopsy collected in a hospital setting is taken to a molecular pathology laboratory (or shipped to a central laboratory service such as Quest or LabCorp); the nucleic acids are extracted and purified; the genes of interest are enriched (using well-known molecular biology techniques, whether hybridization-capture or multiplex PCR); the next-generation sequencing instrument is run; the data are analyzed by bioinformatics specialists; and a report of actionable mutations is returned to the ordering oncologist.

The decentralization of information by the Internet

Only last night my youngest child asked me what the capital of Mali was, which happens to be Bamako (population 3.3 million). Answering was a simple matter of punching ‘capital of Mali’ into my cellphone. For those of us old enough to remember life before the Internet, answering that same question meant going to the equivalent of a ‘central laboratory’ – either the World Book Encyclopedia in the living room (presuming the household had the means to afford one), or trudging off to the library to look it up there.

How did the central repository of information, ensconced in encyclopedias and libraries, become liberated so that anyone with a laptop or a cellphone can obtain it in an instant? It was the decentralization of computing, from the large mainframes of the 1960s and 1970s to the personal computers and networking technologies of the 1980s and 1990s.

What is needed for an ‘Internet of Biology’ are new methods

Even methods such as LAMP (loop-mediated isothermal amplification), which simplifies PCR, or electrically charged magnetic beads, which simplify silica-based nucleic acid purification, offer only incremental help in speeding up current laboratory operations; they do nothing to decentralize those operations.

There currently is a flurry of activity in molecular ‘point-of-care testing’ (POCT), where miniaturized devices offer ‘sample in, results out’ functionality. For one example, Cepheid’s GeneXpert Omni (Cepheid is now a division of Danaher) was highlighted here in 2015; it should be noted (via GenomeWeb, subscription required) that the Omni has still yet to launch.

And these devices are aimed at hospital and clinical environments, requiring minimal training. An FDA CLIA Waiver is issued when a test does not have the same complexity as other tests, so the requirements of CLIA (Clinical Laboratory Improvement Amendments) are waived by FDA regulation. CLIA certification is otherwise required for a laboratory to accept human samples for diagnostic testing.

An at-home test is by definition CLIA Waived. One example is blood sugar testing for diabetics, which uses glucose oxidase to generate hydrogen peroxide that is then detected colorimetrically with dyes. Another example is an at-home pregnancy test, which uses lateral flow immunoassay technology to detect human chorionic gonadotropin in urine.

Thus the first example is the molecular detection of glucose, detected enzymatically with an oxidase; the second is the molecular detection of a hormone (a protein), detected via an antibody-based immunoassay. Note that these are single analytes measured for a single purpose.

How can you get to dozens if not hundreds of protein analytes in a home setting? And what about nucleic acids?

Regarding nucleic acids, you may remember this post from a few weeks ago with an update about Two Pore Guys and their single-molecule detection platform. While this would enable POCT, there remain the challenges of upfront sample preparation, including purification of nucleic acids and subsequent PCR amplification.

One method using functional graphene by Cardea

Graphene is a unique material: first produced in 2004, it was the first two-dimensional molecular substance, and its discoverers (or rather inventors) were awarded the Nobel Prize in Physics in 2010.

This novel material has many remarkable properties: it is as conductive as copper, an excellent conductor of heat, and 300 times stronger than steel. Naturally there is a flurry of research and commercial interest. For example, the EU’s Graphene Flagship consortium is funded to the tune of €1 billion, and China has led the world in a patent race, with an estimated half of the world’s graphene-related patents filed there and some 3,000 companies exploring its use.

One company, Cardea, has been able to manufacture and functionalize a graphene field-effect transistor, coupling it to a biological molecule to form what they call a ‘Field Effect Biosensor’ (FEB), in which binding interactions between two molecules can be detected electronically.

Some background on methods of studying binding interactions

One technology used for label-free antibody/antigen binding kinetics is Bio-Layer Interferometry (BLI), commercialized by ForteBio. An older method, Surface Plasmon Resonance (SPR), commercialized by Bio-Rad Laboratories and GE Healthcare, achieves essentially the same goal – measuring antibody/antigen binding rates in a label-free manner.
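The kinetics these instruments report typically follow a simple 1:1 Langmuir binding model, which can be sketched in a few lines of Python. (This is a minimal illustration; the rate constants below are hypothetical, chosen only to represent a typical nanomolar-affinity antibody/antigen pair, and are not taken from any vendor's data.)

```python
import math

def association(t, conc, kon, koff, rmax):
    """Sensor response during the association phase of a 1:1 binding model.

    The observed rate is kobs = kon*conc + koff, and the response rises
    exponentially toward its equilibrium value req.
    """
    kd = koff / kon                    # equilibrium dissociation constant (M)
    req = rmax * conc / (conc + kd)    # equilibrium response at this concentration
    kobs = kon * conc + koff
    return req * (1.0 - math.exp(-kobs * t))

def dissociation(t, r0, koff):
    """Sensor response during the dissociation (buffer-only) phase."""
    return r0 * math.exp(-koff * t)

# Hypothetical antibody/antigen pair: kon = 1e5 /M/s, koff = 1e-3 /s
kon, koff = 1e5, 1e-3
kd = koff / kon                        # 1e-8 M, i.e. 10 nM affinity
signal = association(60.0, 100e-9, kon, koff, rmax=1.0)
```

Both BLI and SPR fit curves of this shape to recover kon and koff, from which KD = koff/kon; an electrical biosensor reports an analogous binding trace, just transduced as a change in channel current rather than an optical shift.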

Major advantages to Cardea’s approach

These are optical methods, however; by moving to electrical detection of binding events there is promise of greater sensitivity, lower sample input, faster measurement times, and resistance to detergents and contaminants. One important differentiator is the platform’s capability of measuring small-molecule interactions (analytes as small as 10 Daltons).
Cardea has a chip with the graphene sensors on it (see photo) and a handheld reader device.

Image of the chip with enlarged schematics of 15 graphene sensors


The Agile R100 handheld reader

DMSO (dimethyl sulfoxide) is a very common solvent used in drug discovery and drug screening to solubilize compounds and proteins. However, it raises background noise in optical systems even at low concentrations, severely limiting kinetic measurements of protein targets that are otherwise insoluble. Using Field Effect Biosensing, compatibility with a 10% DMSO concentration has been shown in this white paper (PDF).

Recent systems biology publication

This Field Effect Biosensing chip was used recently in a Lab on a Chip publication to identify circulating biomarkers of aging. In brief, the researchers from Keck Graduate Institute and the University of California, Berkeley were studying heterochronic parabiosis, in which two animals of different ages are joined together surgically.

They were able to detect labeled amino acids in a systems-biology fashion using what they call bio-orthogonal non-canonical amino acid tagging (BONCAT), coupled with the graphene biosensor for detection of azido-nor-leucine (ANL)-labeled proteins. The sensitivity of the graphene biosensor, along with its low sample input and cost, was the driving force for use of this technology. Here’s the Lab on a Chip reference.

Note (added January 24, 2019) – they have a new Scientific Reports publication you can access here.

P4 medicine needs new sensing technology

Unless there are convenient, accurate, and sensitive readouts of multiple proteins and other biomolecules at the consumer level, we will be stuck with the paltry measurements today’s smartwatches take: temperature and pulse rate. To realize the promise of P4 medicine, sensor technologies such as Cardea’s are critically needed to push outward from a centralized model (the central mainframe, the central laboratory) to a personal one.

Once computing became decentralized and personal, there came a veritable Cambrian Explosion of innovation, progress, and benefit to society with the Internet. May the same thing happen with biology.

Note: Special thanks to Michael Heltzen of Cardea for an early look at this technology.

Dale Yuzuki

A sales and marketing professional in the life sciences research-tools area, Dale currently is employed by Olink as their Americas Field Marketing Director. https://olink.com For additional biographical information, please see my LinkedIn profile here: http://www.linkedin.com/in/daleyuzuki and also find me on Twitter @DaleYuzuki.
