Medicines in some form have existed since the beginnings of human civilisation, but it was not until the 19th century that the pharmaceutical industry, as we know it today, emerged from the amalgamation of modern biology, synthetic chemistry and free enterprise.
The basic goal, to find new and better medicines to treat patients, has of course remained unchanged. However, the challenges associated with meeting that goal are always evolving, as are the research tools available to do so. Global demand for medicines is rising, and is likely to continue to rise, driven by factors such as ageing populations, obesity-related illnesses and increasing life expectancy and incomes in emerging economies.
The landscape is also changing. Targeted or precision medicines, matched to the patients most likely to benefit from them, are becoming the gold standard. This trend has been greatly assisted by the mapping of the human genome, as well as advances in diagnostic methods.
The concept that ‘prevention is better than cure’, a fundamental principle of modern healthcare, is also now a focus for pharmaceutical companies, which are increasingly seeking drugs that prevent disease as well as treat it (a famous example being Truvada, used for pre-exposure prophylaxis (PrEP) against HIV). The rise of so-called ‘superbugs’ (antibiotic-resistant microbes) is also a growing concern, as the rate at which resistance to existing drugs emerges appears to be accelerating.
Pharmaceutical companies spend on average 17% of revenues on research and development (R&D), one of the highest proportions of any industry. However, it is important to remember that the sector does not comprise only ‘Big Pharma’: a huge amount of pharmaceutical innovation originates in the laboratories of universities and research organisations.