Wearable technology and antibiotic resistance (ABR) working parties

This abstract relates to the following papers: Spender, A., Bullen, C., Altmann-Richer, L., Cripps, J., Duffy, R., Falkous, C., Farrell, M., Horn, T. and Wigzell, J. Wearables and the Internet of things: considerations for the life and health insurance industry. British Actuarial Journal, 24. doi: 10.1017/S1357321719000072.

First, I would like to thank everybody who was involved in the Working Party, in particular our two experts, Oliver (Werneyer) on the technology and Cother (Hajat) on the clinical aspects, who provided very useful guidance and direction as we developed our report. This is a rapidly changing space, to the extent that when we started the Working Party, we did our research on one set of devices and obtained a good understanding of what was happening at the time. However, when we wrote the report 18 months later, technology had moved on so much that we had to go back and repeat that exercise on upgraded devices. The speed with which technology is coming to market and changing the capabilities and opportunities for us as an industry is phenomenal, and it is expected to continue at this pace for some time. It may surprise some of you, particularly those involved in the health and care space, to see that 30% of insurers are already making use of wearable technology.
If we bear in mind that the figure includes use in the general insurance community, which is further ahead in this space than the health and care community, it could be that only quite a small part of the services being offered in health and care actually make use of this technology in practice. But it is certain that this number will grow very quickly. We found that there are a significant number of devices, in five categories that might be beneficial to us. There are the ones we put on our wrists, which many of us are aware measure or track steps and so on. There are also devices that are built into clothing and worn on the body in areas other than the wrist.
There is the Internet of Things, where data is gathered from various places, digested, and reformatted for use. There are specific medical devices that are designed to address one particular disease or concern for an individual. These may not be easy to wear, or use, for a long period of time like, perhaps, the others, but are much more accurate in terms of the results they deliver. I will highlight a few devices: five from the section of our paper covering devices worn elsewhere on the body, and one from the Internet of Things. The first device on that list is called the Oura Ring, which is worn on the finger and is designed principally to assist with sleep and understanding sleep. Many of the devices that are emerging are following that particular track, but this ring appears to be the most advanced and accurate of those currently available.
Neuroon is also an interesting device. It is a mask which fits across the eyes and again focuses on sleep. This device measures EEG waves, the alpha, beta, delta, and theta waves that are generated by the brain. It monitors these to obtain an accurate indication of where the individual is from a sleep stage perspective.
The Moodmetric is the first of what I expect to be quite a large number of devices that monitor our emotional state. It evaluates the extent to which stress is draining you or building you up during the course of a day. It indicates how you are using your emotional resources, and if you are draining them on a regular basis you are going to run out at some point in time. This is the first of a number of devices that can start to track how we are feeling, as opposed to just monitoring steps or something physical.
The FreeStyle Libre is a patch that goes on the arm, which monitors blood sugar levels. It does so on a continuous basis which is useful for insulin-dependent diabetics in terms of understanding where they are at a point in time and is superior to the old formats of a pinprick or a blood draw. The device allows people to see continuously how their body is digesting the glucose, at what speed, and what the peaks and troughs look like in their blood sugar profile. This is very useful information to understand from a diabetic's point of view. Then, finally, the Omron fits on the wrist and measures blood pressure, again doing so on a relatively continuous basis. It inflates and deflates during the course of the day, so instead of just getting an instantaneous measure of blood pressure, you get the measurements over the course of the day. This gives a more accurate profile of how people are responding to circumstances and situations that they are facing.
The final device on the list of the Internet of Things is Sentiance. Sentiance uses a style of measurement which we call "frictionless". It gathers data using the host of sensors built into mobile phones. It tracks what you are doing, knows where you are, knows what speed you are going, and knows how you are changing direction. If you download all that information, you can evaluate characteristics about the person that is using that phone. So, you will know if they are a dog-walker, or a restaurant user. Sentiance also knows how you are driving your vehicle, and the general insurance community has already latched onto this. The important point is that people do not need to do anything for the data to arrive and be used. The element of choice is quite important: one of the challenges we are always going to have is where people exercise a choice over whether they are wearing a device, and so can, to some extent, influence the data that comes through.
When frictionless tracking is being used this becomes less of an issue, as the device is always tracking what people are doing. The data arrives through a series of sensors, a lot of which are geared around the measurement of emotion and so on. There are fairly advanced electroencephalogram (EEG) and electrocardiogram (ECG) sensors coming to the market, being miniaturised so that they can be fitted into smaller, comfortable, wearable devices. There are also sensors that are not being used at the moment: data on things like oximetry and skin temperature might prove interesting. The data is passed through algorithms to determine measurements, and some measures are more direct than others. For example, the heart rate sensor is a direct measure, as it will evaluate the heart rate and record it.
From a business perspective, new devices that provide more information about conditions improve the assessment of risks. For example, our research suggests that fitness is one of the key indicators of prospective health and longevity. Up to now, to measure fitness you have had to go onto a treadmill with a device on your face and have your breathing monitored to find cardiovascular fitness, the VO2 max. Now the algorithms are starting to improve, particularly relating to ECG data, and come up with some fairly respectable measures of fitness, and this again could be very useful in the evaluation of future risk. Galvanic skin response is a measure of stress levels. It is the same technology that is used in lie detector tests: it monitors how stressed you are at any point in time by measuring skin conductance, and uses that to evaluate emotional state.
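As a flavour of how an algorithm might approximate fitness from heart rate data alone, here is a minimal sketch using one published heuristic, the heart-rate-ratio method (VO2 max ≈ 15.3 × HRmax/HRrest). Wearables use far more elaborate proprietary models, so this is illustrative only:

```python
def vo2max_estimate(hr_max: float, hr_rest: float) -> float:
    """Rough VO2 max estimate (ml/kg/min) from the heart-rate-ratio method.

    This is the published Uth et al. heuristic, not any particular
    wearable vendor's algorithm.
    """
    return 15.3 * hr_max / hr_rest

# e.g. a resting heart rate of 60 bpm and a maximum of 190 bpm
# gives an estimate of about 48 ml/kg/min
```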
We are picking up more and more in relation to mental health, as well as physical health, and this ability is going to be important. Finally, sleep quality as well as sleep quantity is important. There is a tendency to focus on how much sleep we get, but the stages of sleep are also critical, that is, knowing when you are in deep sleep or N3 sleep, which literally recharges the body, or in rapid eye movement (REM) sleep, which is important for memory, cognition, and executive function.
One of the key questions that we had to ask, on behalf of the insurance community, is just how accurate are the data? A secondary question is, how consistent are the results? If you put these devices into laboratory conditions, you find that they are not too bad.
For example, for steps, we are seeing a ±10% variation in the results that are recorded in laboratory situations. There are a variety of accelerometers and other motion devices which algorithms then interpret to determine how many steps have been taken. The algorithms are also reasonably accurate and good enough for our purposes. However, some of the devices also report back on how much energy the body is consuming and, therefore, guide people on how much food to eat. These energy reports are, however, very unreliable even in laboratory conditions. There are many reasons why there are problems. These relate to the design of the human body, how the devices are worn and the fact that some of the data that are included rely on self-reported information. For instance, distance travelled is quite a common factor to report on, but that is derived from people putting in their height from which stride length is estimated, and then combining it with steps to give an estimate of distance travelled.
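To illustrate why derived measures such as distance inherit error from self-reported inputs, here is a minimal sketch of the kind of calculation involved. The 0.414 stride-to-height factor is a common rule of thumb, not taken from the paper:

```python
def estimated_distance_m(steps: int, height_m: float,
                         stride_factor: float = 0.414) -> float:
    """Estimate distance walked from a step count and self-reported height.

    stride_factor is a rule-of-thumb multiplier (walking stride is roughly
    41% of height); any error in the reported height, or in the factor
    itself, flows straight through to the distance figure.
    """
    return steps * height_m * stride_factor

# A person reporting a height of 1.75 m who takes 10,000 steps is credited
# with roughly 7.2 km, however far they actually travelled.
```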
There is an important distinction between direct and indirect data: it is important to understand whether a measure has been directly picked up by a sensor and reported literally, processed through very limited algorithms, or whether there is a lot of processing involved in obtaining the figure that you are looking at. Unfortunately, we do not live in laboratory conditions. The real world is more problematic. People sometimes do not wear devices. People are all shapes and sizes and do various things with their devices. People have even run tests with two devices next to each other on the same wrist and still found significant variances in the results, to the extent that the results we have seen suggest there could be ±100% variance. So even though step measures are reasonably reliable in the laboratory, the accuracy of some of the absolute values should be treated with a degree of scepticism at the moment.
Accuracy is expected to improve. From a consistency point of view, it seems that whatever errors occur for an individual occur consistently. So when you look at the consistency of the results measured over a period of time, they are much more stable and more reflective of the changes that the individual experiences. Therefore, if you are looking at change rather than absolutes, you are in a pretty good space. In terms of the current uses, the traditional areas that we are familiar with are health biometrics, outcomes to some extent, and behaviours. The devices and the measures that are created allow us to know our clients a lot better, and offer an opportunity for us to engage customers in new ways. If you use the data wisely you could have better interactions. There is a tendency to think about these devices as drivers of health: if we wear a device, then we become healthier. That is not strictly true. There is an association between the two, but you have to look at the context in which those devices are being used. It is not the device creating the change.
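The reason change is more trustworthy than absolutes can be sketched with hypothetical numbers: a consistent per-device bias cancels out when you compare readings from the same device over time.

```python
# Hypothetical illustration: a device that over-counts steps by a
# consistent multiplicative bias still reports relative change exactly.
true_steps = [8000, 9000]           # actual daily steps (unknown to the insurer)
bias = 1.3                          # consistent 30% over-count for this device
measured = [s * bias for s in true_steps]

true_change = true_steps[1] / true_steps[0] - 1        # 12.5%
measured_change = measured[1] / measured[0] - 1        # also 12.5%: bias cancels
```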
The software communicates the measurements that are coming through to the individual. Quite often, they will be delivered as part of some broader programme, and it is that programme that creates the change for the individual. The programme is supported by the device, which provides data feedback to the individual and enables motivation where people are driven to reach goals, but the device is not the sole driver of change.
We have the opportunity to look at disease-specific products, which we have already started to see come to market with various providers. Our report mentions several that focus on diabetes. The ability to continuously monitor blood sugar is changing the game for diabetes risk. If you can demonstrate to an insurer that you can manage your condition, and do so consistently, they can start insuring that condition in a way that would not have been possible before. There is a methodology around the management of chronic conditions: if you have a chronic condition and you get better quality feedback about how you are managing that condition, then you should be able to control it more efficiently.
From an engagement perspective, the devices present an important new opportunity for the communication that we, as an insurance industry, have with our clients. It is something at which we have historically been poor, and there is a great opportunity here to engage in a much more effective way. From an underwriting perspective, there is an initial opportunity, with more data available at the point of underwriting when the client is acquired: a device could be used to gather more information during the application process. There is also continuous underwriting, which is starting to emerge as a new opportunity for us to evaluate how an individual behaves and track their performance during the course of a year. At the end of that year, the insurer can then adjust either premiums or benefits to reflect what the individual has actually done, given their contractual commitments to each other. Reward science is a complex subject in its own right, but insurers might choose to use rewards in whatever shape or form they have agreed, and a more accurate evaluation of what the individual is due by way of reward becomes possible.
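A continuous-underwriting adjustment of the kind described might look something like the following sketch. The function, the activity score, and the discount/loading bounds are all hypothetical illustrations, not anything proposed in the paper:

```python
def year_end_premium(base_premium: float, activity_score: float,
                     max_discount: float = 0.15,
                     max_loading: float = 0.10) -> float:
    """Hypothetical year-end premium adjustment under continuous underwriting.

    activity_score in [0, 1]: 1.0 means the agreed behavioural goals were
    fully met. Scores above 0.5 earn a discount scaling up to max_discount;
    scores below 0.5 attract a loading scaling up to max_loading.
    """
    if activity_score >= 0.5:
        factor = 1 - max_discount * (activity_score - 0.5) * 2
    else:
        factor = 1 + max_loading * (0.5 - activity_score) * 2
    return round(base_premium * factor, 2)

# On a 1,000 premium: a perfect year gives 850.00, a neutral year 1,000.00,
# and a year with no engagement 1,100.00.
```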
Then there is the accumulation of behavioural and biometric data. We may not know what to do with that data at this point in time, but that is not the point. The point is that there is clear evidence of a relationship between behaviour and outcomes. Those biometrics are a way-marker to those outcomes, and starting to gather the data and link it to the wealth of information available around claims and experience will start to feed into the creation of better products. From a wellbeing perspective, it will be possible to track behaviour and biometrics, use that data to create performance feedback for the individual, and coach them into a better way of behaving based on goals that they set.
We also looked at general insurance, just to understand how far ahead they were and to see if there were any lessons learnt. Tracking of vehicles is used to evaluate risk in terms of how people drive. In claims risk management, they are using devices in environments to evaluate the relative flood or fire risk to which buildings or places are exposed.
Vehicle cover is being provided as and when people need it, so you are insured just for the period of time that you use a vehicle. There is no need to own a car any more, but you still need to be insured. Obviously, it is a problem to use this approach from a health and care perspective, in the sense that there is too great a risk of selection, but you can see the point that the technology allows you to do this. From a claims processing point of view, the use of video and photographic technology, which allows the sharing of information about a loss situation, helps loss adjusters and claims analysts determine the loss incurred. Then, we can consider the very dangerous subject of social media. After the recent events involving Facebook and Cambridge Analytica, it is going to be a dangerous area in which to become involved for a while. It is likely that the information that exists in the social media area is going to be used, and the general insurance community is already starting to do that. Finally, there are opportunities for home diagnosis, to sit and have a virtual General Practitioner (GP) conversation, and there are also the diagnostic tools available.
In respect of mental health, there are some brilliant virtual reality (VR) simulators, which help people with mental health conditions deal with those conditions. They can train themselves in relatively safe environments to have a realistic, but not quite real-life, experience. Our GP and physician community will become more efficient because of their ability to do more with less face-to-face interaction. A lot of the technology we are using is coming from sports. Money is flowing into sports, and the ability to drive the performance of the users, through the monitoring technology, will feed out into our community in due course. I will now hand over to Anna (Spender), who will pick up the second part.
Ms A. R. C. Spender, F.I.A.: Once we had looked at the current landscape, the devices, the measures, and how they are currently being used, we went on to debate what insurers who are engaging in this area need to consider. I am going to consider some elements of data considerations, risks, and challenges. I am also going to look at future developments in technology: there are some things that we previously may have thought of as science fiction which are either here or will be soon. Then, finally, we are going to consider what all this means, or potentially means, for insurance. One of the big advantages of using technology in insurance is the amount and variety of data that we will potentially be able to collect, and the insight and differentiation that this can drive. Colin (Bullen) touched on a few things, though, that will give us challenges in that area. At the moment there is not a set model. There are no studies out there that are really evidence based, at least in the public domain, that we can look at to say, "This is definitely the data that is going to be significant going forward".
So, as you are building propositions and investing money in systems to collect this data, are you collecting the right data, and how is that going to evolve over time? The other area to consider is the future cost and benefit of that data, and whether that cost may increase. At the moment we do not know what is going to be significant. A lot of the measures Colin (Bullen) covered will definitely give us a good indication of the health of an individual, but are these the best measures to be used in the future? I will come on to discuss why that journey might evolve. The cost of the systems at this point in time may be higher than expected, because you will have to build in flexibility: you will have to be able to adapt those models as we learn more. It would be remiss of me not to mention the General Data Protection Regulation (GDPR) that came into force on 25 May. In the world of GDPR, firms will only be able to collect data that is relevant, adequate, and not excessive. This regulation is going to pose some challenges in terms of what data to collect, and we need to consider data in a new way.
It has to be secure, it has to be treated with respect, and we have to earn the trust of users. So, insurers are definitely going to have to invest resource and time to make sure they are not just compliant, but have an eye to the future in this area. This is especially important because this data is sensitive, as it relates to health, behaviours, and lifestyles. The right consents must be in place for that data, but another challenge will be whether people are going to give you their data, and what will motivate them to do so. Arguably, the model would change, with the advantages of technology meaning that insurance can become a much more engaging proposition.
If people are to be willing to give us their data because of what they are going to receive in return, we need to do that on a foundation of trust. There are going to be IT constraints for insurers. Old legacy systems will not cope with this type and volume of dynamic new data, and new systems will need to be agile to change.
There are also actuarial and data analytic constraints. There are no models and tables available, and what will be appropriate is going to be very different from what we have at the moment. Engaging in this business now will be on the basis of limited evidence, and with models built accordingly.
Another aspect to consider is the potential for fraud. We have heard many stories about the inappropriate usage of devices that have been registered to people: users have employed devious means to try to get their activity up, such as giving the device to a child or to a fit friend, or tying it to an animal such as a horse or a dog. So, insurers are going to need models and algorithms that can spot and manage potentially fraudulent behaviour. Colin (Bullen) discussed frictionless tracking, and the way you look at the data will also have to depend on how that data is being collected.
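As a flavour of what such an algorithm might do at its very simplest, here is a hypothetical screen that flags days whose step counts are physiologically implausible or sit far above the wearer's own baseline. The thresholds and the function itself are illustrative, not from the paper:

```python
from statistics import mean, stdev

def flag_suspicious_days(daily_steps: list[int], z_threshold: float = 3.0,
                         hard_cap: int = 60_000) -> list[int]:
    """Return indices of days whose step count looks implausible.

    A day is flagged if it exceeds a physiological hard cap, or if it sits
    more than z_threshold standard deviations above the wearer's own mean.
    Real insurer models would be far richer (cadence, heart rate, GPS...).
    """
    mu, sigma = mean(daily_steps), stdev(daily_steps)
    return [i for i, s in enumerate(daily_steps)
            if s > hard_cap or (sigma > 0 and (s - mu) / sigma > z_threshold)]

# Thirteen ordinary days followed by a 70,000-step day (the device tied to
# a dog or horse, say) flags only the final day.
```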
If there is a selective element to someone actually using a device, choosing when and where they wear it, this is different from frictionless tracking, where the data comes without selection. It is important to consider whether the data is fit for purpose. Some smartphones have a lot of measures on them, but these serve a secondary purpose and may not be accurate. There is also the potential for systematic errors in the device itself. If you are looking at one individual using one device for the duration of the contract, that is probably not a problem. But if that individual upgrades to a different device mid-term, how do you measure the impact of that, especially where incentive programmes are concerned? Also, as Colin (Bullen) mentioned, where the device is worn and how the device is worn also affect the results. In the laboratory the devices may be accurate, but they potentially may not be in the hands of users. We also need to consider how we look at the data: looking at data over time versus at a point in time will give different results in terms of the level of accuracy and the consistency of the data.
Looking at cohorts will reduce the variability in the data, but looking at an individual life, either in terms of a risk assessment or in terms of an incentive programme, may be different. The other thing to consider about data collection and data analytics is whether to invest within your own company or to partner with an external company. There are now aggregators, which pool together multiple different data sources. They collate them, clean them, format them, and give them back in a usable form. These aggregators are typically agnostic, so any device can be used with them.
When we looked at risks and challenges in potentially viable insurance products, we looked at four different areas. These are the technical capabilities that exist and may evolve, the insurer's objectives, the regulator's objectives, and the customer's objectives. We looked at these areas in quite some detail.
The technology is changing all the time, not just in terms of what we can measure, but in how we can measure it and how we can engage customers. The quality, processing, and reliability of data are important, and it is going to take time to build up meaningful and correct datasets. Business cases that are submitted now should be able to evolve over time. The pace of change should be considered when internal processes are put in place: if you are going through a long proposition development cycle, it may be that by the time you reach the end of your cycle and launch your product, the technology has moved on apace again. A possible change in the role of the insurer is interesting, because as technology advances we will be able to be more accurate in our risk assessment. This may undermine the whole traditional model of insurance in terms of pooling risk. It is interesting to consider the impact this may have on potential customers, especially higher-risk customers.
There is the possibility that customers may start self-diagnosing, and then selectively choosing their behaviour. One contentious area was the role of the insurer. If you are setting an incentive programme, looking at particular diseases, and putting things in place that say, "These are good behaviours" and "This is what we will reward you for", are you blurring the line between insurance and medical advice? Will the wording in our policies save us from anything coming back on us in that area? Will regulation keep up with the pace of change of this technology and how it is used? Arguably it will not, given that GDPR has only just come in to cover a lot of data that have already been out there for some time, so insurers will have to form their own view of what regulation might look like in the future and how it will impact their products.
In terms of the customer, it is unlikely that a customer is going to be given a device and think, "Right, I am going to go and exercise now". This is not the solution. It is an enabler to the solution, but it is not, on its own, going to change behaviours. Unfortunately, just changing to good health practices is not enough motivation for a lot of people, and incentive programmes seem to help in this area in terms of getting people to address their behaviour.
Obtaining sustained, consistent usage is a complex subject. These products are going to have to be engaging. They are also going to have to be based on a foundation of trust: security, and clarity about how we are going to use the data, are key. I will mention a few different things that we have come across, to give you a feel for what might happen in the future. Tracking emotional states, using various methods, is already happening, and we can see how it could move to being mainstream. For example, there are companies who have embedded this technology in their staff passes. So, a staff pass might have a Global Positioning System (GPS) sensor as well as voice analysis that can assess your stress levels and also your coping mechanisms.
There is also Bluetooth functionality that can look at your body language. If you combine that with all of your employee data, you could start entering "Big Brother" territory in learning who is talking to whom, and when stress levels spike. An interesting area is microchipping for humans, which is evolving quickly. There is a Swedish technology hub which has already offered these kinds of implants, embedded under the skin, to its employees. The implant allows those employees to open doors in the building, print, and also pay for drinks. It will not be long, possibly less than 10 years, before this technology also monitors things like health. As Colin (Bullen) mentioned, a lot of wearable technology is developed for monitoring elite athlete performance, but it can be used in many aspects of the sports industry. Being at the forefront of using this technology might give a tactical advantage in sport, whether in terms of improving coaching, predicting injuries, rehabilitation, or improving the customer experience of that sport. However, there was a case last year where the Boston Red Sox were accused of cheating for using this type of technology and relaying information to players via Apple devices during the match.
Facebook have announced that they are looking at the brain-computer interface in a lot of detail, and their plans are to go beyond tracking emotional states to actually reading people's minds. They have not said how they are doing this, but it is coming, and it is already being used in some other areas. One can clearly see a use for exoskeletons or bionic suits that interact with the brain in rehabilitation around spinal injuries. There is also something called the "chairless chair" that is used in industries such as manufacturing, to allow employees to sit wherever they need to at any point in time. So, what does this all mean for the future of insurance? We do not have all the answers, but we do not think insurance is going to drive the take-up of technology; more and more, it will become a user of it and of the data that is generated.
The elements around anti-selection and asymmetry are important. As these devices become more accurate and the ability of individuals to diagnose at home becomes much more of a reality, people will start choosing what insurance they want and when. Some insurers will keep up with this process and some will be left behind. There are definitely a lot of new opportunities and lots of ways that we can start to engage customers in a much more informed way. There will be new products, whether they are disease-specific products that need to be managed, or ones that enable totally different ways of looking at insurance and the journey that the customer has with us. Then, going beyond reported conditions and claims, we need to look at what else this sort of technology can begin to tell us about people's health, and about predictions of people's health. The collation of the data and its analysis will be a challenge that we will need to overcome. We believe that, to date, progress has been evolutionary. But things are changing rapidly and could become revolutionary.
There is a risk element to getting involved at the moment, but there is also a risk element in not getting involved. You have to take a leap of faith and design your product, maybe in a way that evolves over time. You are not going to have all the answers now.
Another area to consider is who will take advantage of the technology. If we, as insurers, do not take advantage, will tech-savvy companies come into the market and totally reinvent it? There are short-term versus long-term issues. Business cases may need to change over time. So, you might design a proposition now, collect the data, engage with customers, and then decide at a later stage how to use your findings in terms of behavioural change and how it can affect health.
I have no doubt there will be game-changing innovations in the future. There could, for example, be automatic monitoring of what people are eating. We certainly know that technology will continue to change. The only question for everyone is, when will you get involved and how?
In conclusion, this is a rapidly developing area: if we did the research again today, there would be more things out there. There is definitely a market opportunity for us, but it could also be a catalyst for change in terms of how we engage with our customers and how we are seen by them. Technology is only part of the solution; you need a bigger proposition around it to really engage the customers. Just giving them the technology is not going to change anything. Data considerations are key, as is the ability to look at what regulation may come in the future. We must always look at the data from an ethical and moral standpoint.
There is a risk that the market fragments through being able to assess risks in a different way, leading to a reduction of the pooling system. We are going to see more specialist products coming to the fore for dealing with specific diseases. It will be an evolving business case and product development journey for people.
Mr D. Simmons (opening the discussion): I have two comments. One relates to the mood or emotional space. We need to be clear about what data we are collecting: for example, we might be collecting pictures, voice, or some electronic measure, which we are then deriving or interpreting as some emotional mood. The other observation is from the general insurance industry, about how you persuade people to use this technology. Now, in the case of young drivers, for example, most insurers insist that they have some tracking device fitted to their car. So, for that particular group, they have literally forced them to engage. Do you have any thoughts on the effect of this? Is there a timing effect? If you have the device fitted to your car, perhaps you may drive a bit more carefully for a few months. After a while, you may forget the device is there, and perhaps the claims experience then reverts to what it would have been if the device had not been fitted in the first place.
Mr Bullen (responding): To respond on the data side, the point could be made that the important part of this journey is that we are going to be accessing more and more data that we are not used to holding. Interpreting that data, using it in the right way for the right reasons, and generating trust with our clients is important. We are going to have to go beyond the legislation. If we are going to have any kind of relationship with our clients, we are going to have to build it and be very aware of the nuances in the way the data is collected.
Mr M. R. M. Elsheemy: It is good to see the impact on life and health, but might the technology have an impact on other fields where actuaries work, like pensions? People may live longer and be healthier. Should we consider looking at the impact of wearable technology on the health of the general population? I understand even the government is considering using wearable technology for this purpose.
Ms Spender (responding): We definitely focussed on the life and health area and did not consider pensions. They are a consideration. There is certainly the aspect that we discussed, in terms of which populations will interact with this technology easily and share their data easily, and which will not? For pensions it is going to depend on what time in the life cycle of the pension that you consider.
Mr M. Kirkpatrick, F.I.A: Have you come across any examples of where this technology is being used in the NHS? Alternatively, are there any impediments being put in place by government or regulators, or anything that they could do to help encourage the industry?
Mr Bullen: The example that springs to mind from the research we did on the Internet of Things is that there is a lot of work being done around the frail and elderly. There are a variety of investigations or experiments being undertaken, where they are introducing various forms of tracking technology, which includes spatial technology to monitor people moving around, and pressure pads. There is also communication technology which feeds data from the care environment to the individuals who are providing the care. The principle behind this is that it should require fewer resources on-site while providing a much safer and more secure environment for individuals in elderly care or dementia settings. The NHS is looking at a wide range of other possible uses.
A questioner from the audience: Given that the insurance industry writes a lot of guaranteed products that go out 20 or 30 years, is there a danger in those products where people start antiselecting against them in the future?
Ms Spender: It is going to depend on how the different products evolve, and how people's use of the technology evolves, so that they have the information they need to be able to anti-select. Yes, some of the long-term guaranteed products that are already in place could potentially be at risk, if people find they have different information on which to make decisions.
The Chairman: Our second working party this session is the Antibiotic Resistance Working Party, which was established in January 2017 with the aim of developing a simple modelling framework for the impact of antibiotic resistance on mortality. The final model will be released in a sessional paper next year, but in this session we have the opportunity to hear about an overview of their motivation, and an update on their work to date. We have three speakers representing the Antibiotic Resistance Working Party: Nicola Oliver, Matthew Edwards, and Ross Hamilton. Nicola Oliver is the founder of Medical Intelligence, established in 2007, a consultancy dedicated to providing impartial expert insights into the main drivers behind changes in life expectancy, and disease risk for the insurance industry. Prior to that, Nicola worked as a nurse in the National Health Service, in intensive care and in public health. Nicola is an affiliated member of the Institute and Faculty of Actuaries (IFoA) and is chair of the Diabetes Working Party, as well as being deputy chair of the Antibiotic Resistance Working Party.
Ross Hamilton qualified as an actuary 2 years ago and has been working in pensions consultancy over the past 7 years. He recently moved to work in pensions risk at Lloyds bank, looking at asset liability monitoring, investments, and assumption setting, including longevity for accounting and funding purposes. Ross volunteered for the Antibiotic Resistance Working Party to help further his understanding of mortality drivers, and has been working on the modelling and literature review work streams.
Matthew Edwards works in the life insurance practice of Willis Towers Watson, leading work around mortality, longevity, and policy holder behaviour. He is particularly interested in the application of advanced analytics to life risk, the interface between actuarial techniques and medical expertise, and innovation. He is primarily interested in innovation through repurposing.
Mr M. F. J. Edwards, F.I.A.: We have had an interesting talk looking towards the future in a positive, optimistic and almost science-fiction way. Our presentation is almost the opposite. It is rather looking backwards, and picturing how things might work if the whole of medical science in respect of antibiotics goes back 50 years. Regarding our title, if you are sufficiently concerned by our talk and want to find out more, there is a good book called The Drugs Don't Work by the chief medical officer, Professor Dame Sally Davies (Davies et al., 2013). This 100-page Penguin introduction to the subject is very good to follow up if you want to do a bit more research without joining our Working Party. What we want to do here is give you a very quick overview of what we are trying to achieve in the Working Party, which is to come up with some sort of useful modelling structure. The medical overview will be covered by Nicola (Oliver), and then Ross (Hamilton) will talk through the model structure and the parametrisation. I will then give some results.
We have been modelling and parametrising for one pathogen, so we can take a look at that and have a sense of the order of magnitude of the results.
To give some background to the Working Party: almost exactly 2 years ago, in my capacity on the mortality research steering committee, we thought it would be a good idea to have a session with a couple of talks and papers on antibiotic resistance. The subject seemed to be more and more in the news and was something that was going to have a potentially material impact on mortality and longevity. We had an event around the theme of antibiotic resistance and were encouraged to carry on our work and to develop a model that actuaries would find useful in furthering work in this area. Existing models did not seem very suitable for actuaries. For instance, the O'Neill Review (O'Neill, 2015), which had come out round about that time, talked about various models developed by Rand and KPMG. It is hard to penetrate the surface of these and find out what is going on. One of them is built around economic drivers and does not really delve into what you might expect from a mortality model, and the other is equally opaque. So, we considered it would be useful for actuaries to have a model which would help them to think through the impacts of antibiotic resistance and whether they are going to be material, and also what you would do about stresses for your internal models. Those are the reasons we decided to form the Working Party, with the aim of developing an interesting and useful modelling framework. Figure 1 outlines the members of the Working Party. I am the chairman of the Working Party. Nicola Oliver has been finding a huge amount of useful medical input. Sheridan Fitzgibbon is working on model structure and parameterisation. Craig Armstrong from Aviva and Ross Hamilton have both been developing the model structure and parameterising the model. Craig Armstrong drove the project forward, so we put a 2017 note out in his name, but he is no longer active.
The figure shows several other people important for various other aspects of the work including the literature review and providing general moral support. I will hand over to Nicola for the medical overview.
Ms N. Oliver: The first question is: what is antibiotic resistance? The answer is based on the medical understanding that has helped us to shape some of the parameters and data points that we have put into the model. Antibiotic resistance is a survival-of-the-fittest scenario. It arises quite spontaneously, but it is also something that is passed on from one generation of bacteria to the next. Simply put, in the presence of an antibiotic, the bacteria that happen to be resistant survive; those that are not resistant die. You are then left with a continuing generation of resistant bacteria or pathogens. Bacteria are very well placed to spread resistance: they have a very large population, and their evolution happens at an extremely fast pace. Each generation of bacteria can reproduce, sometimes in as little as 20-30 minutes, and they are able to pass on their resistance in two different ways. The standard way, known as vertical gene transmission, is where reproduction takes place, the cells split, and any resistance traits are then passed on to the daughter cells. Bacteria also have a very clever way of developing their resistance by spreading it horizontally: they can simply pass DNA material to other bacterial cells, which does not require reproduction in order to happen. In the presence of an antibiotic, the susceptible bacteria are killed, while the resistant ones thrive and continue to pass on their information. Antibiotics kill bacteria in two ways. A bacterial cell has a very simple structure, with a wall containing plasmids, DNA material, and so on inside. The bactericidal antibiotics work by interrupting the pathway that the bacterial cell uses to build the wall around it; this weakens the cell until it simply splits apart. There is a large group of antibiotics that fall under that particular umbrella.
The ones you are most likely to know are the beta-lactams, that is, the penicillins. The second group, known as bacteriostatic, work on the inner machinery of reproduction of the cell. In essence, they slow down the rate at which the bacterial cell can reproduce, rather than actually killing it. That allows the body's immune system to kick in, because the bacteria are no longer multiplying quickly. Because of the process involved, this class of antibiotic requires a sufficient duration of time in order to work, and that is where problems arise when antibiotics are not taken for the correct period of time, which is absolutely essential. Some drugs also have a crossover effect, in that they can act in both of the two ways discussed above. It has become globally accepted that antibiotics are no longer as effective as they used to be against some of the major pathogens. The question is, why has this happened?
The first explanation goes back to the evolutionary system. Bacteria, like all species, are subject to survival of the fittest, and therefore, in the face of some antibiotics, the resistant ones will thrive. Simply using antibiotics will create resistance. The second explanation is that antibiotics have been overused and used inappropriately. Antibiotics are only effective against bacteria; they will not work against viruses. Most of the general, ordinary, day-to-day infections that we all experience are viral infections that are not going to respond to an antibiotic. However, we live in a culture where there is an expectation that if we are sick, then we will take a pill, and then we will get better. So, we are given antibiotics for viral infections, and they obviously are not going to have any effect whatsoever. The other issue that is quite key is around the food chain, and this is becoming of more and more interest. Back in the 1950s in the US, chicken farmers discovered that chickens that had been administered antibiotics grew very quickly, and it became accepted practice to give antibiotics as growth promoters in the food chain.
This resulted in an expectation of cheaper food on our tables. The problem is that those antibiotics remain present in those animals, and we are ingesting them. That happens in chicken, beef, and pork. In addition, antibiotic-resistant bacteria are also present in those animals, and will remain active in undercooked meat. We now have a lot of legislation in the EU which prevents the routine use of antibiotics in animals and in the spraying of some fruit crops, but that does not extend to wider populations. Antibiotics are allowed to be given to farm animals in the case of illness, and they then have a route into the food chain. In summary, we know how antibiotics work against bacteria, we know how bacteria work against antibiotics, and we now know how those things have made it into the human system. So, what is the size of the problem? Well, we know that there are a large number of pathogens that are causing problems. The biggest problem is around the incidence of a blood infection called septicaemia, bacteraemia, or sepsis.
You are not going to die from a urinary tract infection or a wound infection, but you will die if those bacteria enter your bloodstream. In fact, the fatality rate of septicaemia in severe cases is 80%. This is because, if a bacterium enters your bloodstream, your immune system mounts an extremely overblown response, and attacks your major organs. You go into organ failure, and you die. We are exposed to the risk of antibiotics and failure of antibiotics throughout our whole life. We do not routinely give antibiotics in childbirth any more, but there is certainly a peak in use at those times. Meningitis and trauma in childhood attract the use of antibiotics. Going forward through the age range there are also circumstances where antibiotics are given for prophylactic purposes. For example, certain types of chemotherapy render the immune system extremely vulnerable. You are open, then, to potential infection. Also, any type of surgery that involves placing something inside the body and leaving it there, such as heart valve replacement or any type of joint replacement has infection risks.
The number of people that receive a hip replacement each year in this country runs into many thousands. We could be faced with a scenario where we do not do the surgery because we do not have effective antibiotics, leaving somebody in a prolonged state of disability. The main routes to a septicaemia are through the urinary tract, through the respiratory tract, through the skin, and through the abdomen. All surgeries create a risk, not just of disability and morbidity, but also of increased mortality.
In 2016, the World Health Organisation (WHO) put together a priority list of pathogens which they felt were the greatest threat, and concluded that they wanted to instigate increased research into development of more appropriate and effective drugs.
One of the other issues around antibiotic resistance is that there are very few new classes of antibiotics. Whereas we have, each year, many hundreds of new approvals for mainstream chronic diseases, such as cancer and heart disease. I do not believe there has been a novel class of antibiotic for probably 10-15 years. This is partly because the route to discovery is more complex, but also, because it is not particularly economically advantageous to pharma companies. If you produce a blockbusting drug that treats blood pressure, the patient will be taking that drug for the rest of their life. In contrast, you take an antibiotic for a week, so there is much less financial incentive to undertake the research and development.
The WHO put together a list from a selection of 20 different pathogens, and they wanted to assess it against a number of weighted criteria, which are shown on the left of Figure 2. They came up with a top 12 list of critical, high, and medium pathogens, in terms of priority as shown in Figure 3. These were the ones that gained the highest score, if you like, in the weighting analysis.
What we are going to do now is have a brief look at those top three, in terms of what type of infections they produce, what the risks are, and what their current status is. The three critical pathogens are something called Acinetobacter baumannii, which we will call A. baumannii; pseudomonas; and a whole family of different bugs called the Enterobacteriaceae. We will start with A. baumannii. One of the chief concerns with this one, and the reason it makes the top of the list, is that it is particularly prevalent in the hospital setting. So vulnerable patients, who are already exposed to other factors, are also at high risk of contracting this bug. It is a very "clever" bug: it is able to survive on dry surfaces for very many months, and it is known to be driven by poor infection control, such as inadequate handwashing, and by antibiotic overuse. One of the biggest concerns is that A. baumannii has now been discovered to be resistant to a last-resort antibiotic called colistin. This is an older antibiotic that is not used very frequently, because it causes damage to the kidneys, but in a risk-benefit scenario you often have to go down that route.
If you consider an intensive care patient, there are lots of different ways your patient can get an infection. You stick needles in them. You stick tubes into their bladders. You put tubes into their veins, and you put a nice, big tube into their lungs. A. baumannii is associated with extreme cases of pneumonia, wound infections, urinary tract infections, and ultimately, a bloodstream infection. The antibiotics that are available to treat this are starting to become a very short list indeed.
The next one down the list, pseudomonas, is a bacterium found very widely in the environment and a very common cause of infection. It is responsible for very mild infections as well as some very serious ones. It has a very similar profile to the previous pathogen, being very prevalent in hospitals and driven by poor attention to basic hand hygiene, and it is associated with a very similar set of infections, such as pneumonia and wound infections, and then, ultimately, bloodstream infection.
The last group that I am going to look at is the Enterobacteriaceae. This is a large family of pathogens which all respond and work in a very similar way. You may have heard of a couple of members of that family: Salmonella is one, and E. coli is another. This group of pathogens is responsible for quite a lot of nasty types of infection, in particular digestive tract infections. We have already discussed the standard respiratory tract and urinary tract infections, but in the last 10 years it has become evident that E. coli is responsible for more and more of the bloodstream infections that we are seeing. So much so that Public Health England started to monitor the rates of E. coli infections, and particularly the rates at which people were dying from blood-borne E. coli infections. The main route for E. coli bacteria into the bloodstream is via the urinary tract. It is a very common cause of urinary tract infections, which can become untreatable.
That infection then ascends to the kidneys, then it enters the blood, and then that is where you have the bacteraemia. Because Public Health England were very interested in these rates of infection and rates of death, it made it easier for us to gather the relevant data, and E. coli is the pathogen of choice for our modelling exercise. We will be moving on to look at more pathogens in the very near future, and in particular, those ones from that World Health Organisation critical list.
This issue is very important. Dame Sally Davies spoke at the Spring lecture last year and has brought this to the fore in the UK, in terms of bringing it to the attention of the press and writing extensively about it. In fact, about a year ago, there was a pilot study run for GP surgeries in England, where letters were sent to GPs to try to encourage them to prescribe less. This had a reasonable effect, so small measures are being taken. The O'Neill report estimated many millions of future deaths if we do not do anything about it. As I mentioned earlier, we have a problem with finding new drugs, because of the way the pharmaceutical reimbursement model works. There are also problems in changing people's behaviours.
A study published a year or two ago found that between 2000 and 2015, when you would expect to have seen a reduction in antibiotic use, there was a 65% increase around the world. This is incredible considering the messages that are supposed to be going out there. It has been driven mostly by low- and middle-income countries as they start to increase their wealth; we see a correlation between increasing GDP and increasing use of antibiotics. There has also been a rise in the ability to buy antibiotics and bypass the whole prescription process. You can do this in some countries over the counter, and you can probably buy antibiotics from the Internet. In this particular study, one of the biggest issues, and this comes from India, is something called a fixed-dose combination, where drugs are made cheaply and several types of drug are combined in one pill. So, if you needed something for an infection, you might find in that pill an antibiotic for a different type of infection; or if you took a prescription for something else entirely, you may find some antibiotics hidden inside as well.
More recently in the news was the first case of antibiotic-resistant gonorrhoea. This concerned a gentleman who returned from the Far East, and then the usual combination of oral antibiotics did not work and he required 3 days of intravenous antibiotics. Additional cases are emerging around the world, so even infections which we have previously found extremely easy to treat are now sometimes problematic.
I would like to leave you on one or two notes of optimism. One of the greatest discoveries has been understanding the method by which we are able to identify new antibiotics. That has always been a challenge, partly because each time scientists start to look for new antibiotics, they just keep finding the same ones. However, there is now a discovery platform that has allowed scientists to look in samples of soil. In particular, they have managed to isolate a class of antibiotic, the malacidins, which we know are particularly effective against multi-drug-resistant pathogens. A second interesting study is related to genomic data which is able to identify which bacteria have a genetic propensity to be resistant, and to maybe even start to manipulate that resistance to our own advantage, which leads me to the last study here. Many of you may have heard of Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR); it is a way to be able to edit genetic material, simply by snipping out the parts that you do not want, and inserting healthy parts. This is something that could be done inside a bacterial cell just as effectively, so there are studies that are starting to look at manipulating the DNA of a bacterial cell in order for it to turn on itself and destroy itself. This is one of the ways in which we could tackle antibiotic overuse without having to discover new antibiotics.
That concludes the medical overview which informed our decision-making about the type of pathogen to investigate, and how we were going to model. I will hand over now to Ross (Hamilton).
Mr R. O. B. Hamilton, F.I.A.: I am going to briefly take you through the model structure and parametrisation work streams and what we have done so far, although it is still a work in progress. To determine what type of model structure to use, our first step was to clarify our objectives. We settled on modelling the effects on both mortality and morbidity of rising or varying resistance to antibiotics in the future. The next stage, after defining the objectives, was to look at the relevant literature. This told us what research had already been done, which we could use to shape what we were going to do. The O'Neill Review was a great starting point for us because it was a very high-profile review.
There were two models produced by KPMG and Rand, which we looked at and decided that, although their approaches were useful as a starting point, we certainly wanted to do something different for our purposes. We then looked a bit further afield and found some meta-studies which were helpful, because they did a lot of the work for us by categorising much of the research produced in the last 15 years. They categorise that research by the question it was trying to answer, and also by the model used in trying to answer that question. That indicated that there were a variety of ways we could look at this, including regression analysis, survival models, and significance testing, to name but a few. Furthermore, there was not a single model structure associated with any particular question. We now had a good base from which to work, in terms of the different types of models and structures that have been used, and we just needed to firm up the structure of our own model.
The models needed to be complex enough to model the scenario that we were trying to illustrate, but not overly complex so that, for example, the data burdens were too great, or the data did not exist to populate the model. Finally, we wanted to make a model that was adaptable, so that users could input either their own data or their own assumptions about future resistance, to form their own view on how they think this is going to impact morbidity and mortality in the future. Taking all these factors into consideration the model structure that we decided to use was a multistate Markov model. It was a relatively simple model and well-known to the actuarial field. There is good availability of data for the transition rates and it produces the appropriate output. There is an illustration of the model structure in Figure 4. There are typical healthy and dead states, and then in the middle, there are two sickness states.
There is one sickness state where you have an infection that is susceptible to antibiotics, and then the one on the right is where you have an infection that is resistant to antibiotics. Once we decided on the type of structure for the model, we then set up a work stream to consider parametrisation. In Figure 4 there are sigmas near each of the arrows. These denote the transition intensity between each state and the next, and those transition intensities can be informed by data that we can observe and have been collected.
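As an illustration of how a four-state structure of this kind can be put to work, here is a minimal sketch in Python. All of the intensity values are hypothetical round numbers chosen only to show the mechanics, not the Working Party's fitted parameters, and the `generator` and `project` helpers are our own labels for this sketch.

```python
import numpy as np

# States: 0 = healthy, 1 = infected (susceptible strain),
#         2 = infected (resistant strain), 3 = dead.
# All intensities below are hypothetical, for illustration only.

def generator(incidence, resistant_share, cfr_sus, cfr_res, recovery=52.0):
    """Annual transition-intensity matrix Q for the four-state model."""
    q = np.zeros((4, 4))
    q[0, 1] = incidence * (1 - resistant_share)  # catch a susceptible infection
    q[0, 2] = incidence * resistant_share        # catch a resistant infection
    q[1, 0] = recovery * (1 - cfr_sus)           # recover from susceptible case
    q[1, 3] = recovery * cfr_sus                 # die from susceptible case
    q[2, 0] = recovery * (1 - cfr_res)           # recover from resistant case
    q[2, 3] = recovery * cfr_res                 # die (higher fatality rate)
    np.fill_diagonal(q, -q.sum(axis=1))          # rows of Q must sum to zero
    return q

def project(q, years, steps_per_year=260):
    """State-occupancy probabilities after `years`, via small Euler steps."""
    p = np.array([1.0, 0.0, 0.0, 0.0])           # everyone starts healthy
    dt = 1.0 / steps_per_year
    for _ in range(int(years * steps_per_year)):
        p = p + dt * (p @ q)
    return p

q = generator(incidence=0.002, resistant_share=0.12,
              cfr_sus=0.10, cfr_res=0.25)
p20 = project(q, years=20)
print(p20)  # occupancy of [healthy, sick-susceptible, sick-resistant, dead]
```

A user of the real model would replace the illustrative intensities with values derived from the incidence, resistance and case-fatality data described below.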
Turning to the data sources that we have used and some of the issues that we have come up against when looking at that data, so far we have only looked at E. coli: one of the reasons for this was the availability of the data. For example, Public Health England has started producing statistics based on the incidence of people getting E. coli going back to 2013. So, while obviously, we would like the data set to be larger, this is still a useful starting point.
This Public Health England data, showing us the number of reports of E. coli in each year, split by various age groups, was very helpful. We have had to supplement that data in order to obtain a split between those infections that are resistant to antibiotics and those that are susceptible. So, we have looked to the European Centre for Disease Prevention and Control (ECDC), or more specifically, the European Antibiotic Resistance Surveillance (EARS) Network. The EARS Network produces rates of resistance for various pathogens, and the type of antibiotic to which they are resistant, going back to around 2002 across Europe. That data is split by region, so, for example, we are able to look at the UK specifically. In addition, it is split down by various age groups. So, we can take this resistance data for the various years and combine it with the Public Health England data to split the reports into those that relate to E. coli cases that are resistant to third-generation cephalosporins and those which are susceptible.
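The splitting step itself is simple arithmetic. The sketch below, using entirely made-up counts and resistance shares rather than actual PHE or EARS figures, shows how incidence reports might be apportioned between resistant and susceptible cases by age band:

```python
# Hypothetical annual E. coli bacteraemia reports by age band, and a
# hypothetical share resistant to third-generation cephalosporins.
# These numbers are placeholders, not published surveillance figures.
reports_by_age = {"19-44": 4200, "45-64": 9100, "65+": 27400}
resistant_share = {"19-44": 0.09, "45-64": 0.11, "65+": 0.13}

split = {
    age: {
        "resistant": round(n * resistant_share[age]),
        "susceptible": n - round(n * resistant_share[age]),
    }
    for age, n in reports_by_age.items()
}
for age, counts in split.items():
    print(age, counts)
```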
There are some limitations of the data. There is little information on how the level of resistance interacts with the level of incidence; you might think intuitively that they would interact in some way, but there is just no evidence of this at the moment. Finally, and this is a common theme throughout most of the data that we have collected, there is the issue of potential bias. For example, Public Health England is looking at healthcare-acquired infections, and this could understate the level of infections by not including those acquired outside the healthcare setting.
To move onto the next level, we investigated mortality rates, from E. coli in this instance. Public Health England is producing some information on the case fatality rates of E. coli, and again, that is split by various different age groups, and there is some data going back to 2013. However, again, there is no split in the data between that which is resistant and that which is susceptible to antibiotics.
For this investigation we have had to supplement this information with some academic research papers that we found in the literature review. We have been able to use the ratio of the case fatality rate for the resistant version of E. coli relative to that for the susceptible version. There is a limitation in that the data is not quite available in the granular detail that we need. Again, there is also some bias within this data, in terms of which populations are being sampled. We have now been able to take all this information and populate our base scenario for the transitions between the various states. However, an important element of this model is looking at how resistance, and trends in resistance, might impact the movements between the states of the model. As noted earlier, the EARS Network has data going back to 2002 on antibiotic resistance in a UK setting, split down by pathogen and the antibiotic to which it is resistant.
We are able to take this data and extrapolate it into the future. For our initial parametrisation, we used straight-line extrapolation to get a feel for what might happen to mortality and morbidity. We would also like the model to be able to look at different trends in resistance: for example, there might be some type of exponential growth in resistance, or, potentially, some type of tapered resistance. The key judgement here is the choice of parameters for the resistance function, which defines the shape of resistance into the future. Nicola (Oliver) has mentioned the education initiatives that are going on worldwide and how they might dampen trends in resistance by reducing consumption. At the same time, there are factors that might work in the opposite direction: for example, no new classes of antibiotics have been discovered in a long time, there are not necessarily any new ones in the pipeline at the moment, and the lead time for producing these types of drugs can be well over 10 years. We are hoping that the model will be able to capture, in a basic way, some of these effects in projecting out resistance.
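As a rough illustration, the three trend shapes mentioned (straight-line, exponential growth, and a tapered path) might be sketched as follows; the base rate, slope, growth rate and ceiling values are placeholders, not fitted EARS-Net parameters:

```python
import math

# Illustrative resistance-trend functions. r0 is the resistant share
# in the base year and t is years elapsed; all parameters are made up.

def linear(r0, slope, t):
    """Straight-line extrapolation, capped at 100% resistance."""
    return min(1.0, r0 + slope * t)

def exponential(r0, growth, t):
    """Compound growth in the resistant share, capped at 100%."""
    return min(1.0, r0 * (1 + growth) ** t)

def tapered(r0, ceiling, speed, t):
    """Exponential taper toward a long-run ceiling below 100%."""
    return ceiling - (ceiling - r0) * math.exp(-speed * t)

r0 = 0.10  # hypothetical 10% resistant share in the base year
for t in (0, 10, 20):
    print(t,
          round(linear(r0, 0.005, t), 4),
          round(exponential(r0, 0.05, t), 4),
          round(tapered(r0, 0.40, 0.08, t), 4))
```

Each function feeds the same transition intensities in the model, so a user can swap in whichever shape matches their own view of future resistance.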
Matthew will now discuss the results. Mr Edwards: Figure 5 gives a sense of the order of magnitude of the results that we have obtained so far with E. coli, with the projections parameterised along the lines Ross (Hamilton) was describing. We are looking at the working-age population, between 19 and 64, and projecting the position in 20 years' time. We are going to provide various scenarios, because there are, intuitively, two quite likely outcomes, neither of which is that resistance simply increases gradually ad infinitum: either resistance increases until cures are found and it falls back again, or it ends up compounding out of control.
We have provided almost a mean of those two positions, as that may hopefully give some sense of what might happen, but is in itself a fairly unlikely event compared to those two slightly more extreme positions. What we have found from this approach is, overall, a 1% increase in mortality rates in 20 years from one strain. The 1% is in relative terms, not absolute terms. This is very small unless you have massive exposures. If you look at it in terms of the impact on mortality improvements, it is slightly more interesting, because the improvements themselves are an order of magnitude less than the mortality rates. If you consider four or five pathogens, and assume that they all have a similar impact, in 20 years there could be a reduction of the order of 0.2% per annum in longevity improvement rates.
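The arithmetic behind that figure can be checked on the back of an envelope, assuming the 1% uplift accrues evenly over the 20-year projection and that the pathogens' effects simply add:

```python
# Back-of-envelope check of the quoted figures (illustrative assumptions:
# even accrual over the projection, additive effects across pathogens).

uplift_per_strain = 0.01   # 1% relative mortality increase over 20 years
years = 20
n_pathogens = 4            # "four or five pathogens"

annual_drag_per_strain = uplift_per_strain / years   # ~0.05% per annum
total_drag = annual_drag_per_strain * n_pathogens    # ~0.2% per annum

print(round(total_drag, 4))  # 0.002, i.e. ~0.2% off annual improvements
```

With five pathogens rather than four, the same arithmetic gives roughly 0.25% per annum, which is why the figure is quoted only as an order of magnitude.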
Although this is small compared to the absolute rates of mortality, it is still just about material in terms of the impact on mortality improvement. These findings are just from very indicative modelling. This is broadly halfway between a disaster scenario of things going terribly badly, and a good scenario where things start badly but then people step in and cures are found. Hopefully, however, it gives some idea of the orders of magnitude, which are a lot less gloomy than some of the things you may hear in the headlines, or some of those Nicola (Oliver) alluded to in her presentation.
The other interesting thing we can do with this model is look at plausible stressors. We can look at some of the 95% confidence intervals around our projections which are giving us increases in overall mortality, in 20 years' time, of the order of 10%-20%. Again, that is not quite as catastrophic as what you might have expected from some of the headlines. Hopefully, that is of interest in terms of positioning where we are at the moment.
Regarding next steps, we are looking to carry on the model development over the summer. One of the problems we have had with the Working Party is persuading people who have day jobs to volunteer large amounts of time. We are recruiting a couple of new people through Cass Business School. We are confident we are going to obtain a fully developed model with four or five pathogens validated and documented which we will be able to present at the sessional meeting at the start of next year. This will include a full set of results and details about the parametrisation. We will also publish the spreadsheet model with parameters. This is all going to be based on UK data: if you are interested in other territories, you can use the same sort of framework and structure. We trust our presentation has given you a good idea of the direction of travel at the moment, and an interesting overview of the main issues.
Mr M. G. White, F.I.A. (opening the discussion): Can we put all of this into context considering overall worldwide pandemic risk, from all causes, including viruses? Are we directing our concern to the right areas?
Mr Edwards (responding): If you are an actuary interested in mortality and longevity, you want to consider all the different relevant drivers, for example, diabetes or epidemics. The aim of this Working Party is just to look at the effect of ABR in isolation. I agree it would be nice to be able to consider a wider scope, but believe what we are doing is going to be useful.
Mr B. P. Ridsdale, F.F.A.: It struck me that a 20% increase in mortality in 20 years is actually quite large. Most deaths occur at over age 65 and you said you were modelling up to 65. It seems to me that older people are in hospital more. Do you see a substantial increase in mortality in the over 65s, and will you be putting that into your model?
Ms Oliver (responding): You are correct in saying there is an increased risk in the older groups, not least because the immune system deteriorates as one ages. In terms of morbidity, we are aiming to examine the impact of things like not being able to undergo surgery that would otherwise give a more comfortable life, plus the fact that certain types of chemotherapy may not be able to go ahead. There are certainly aspects around different age groups that we will be considering. Another rather UK-centric issue is looking at local health resources and whether certain surgeries may well not be an option anyway.
Mr R. Hall: Has the Working Party looked back, approximately 30 years, to the AIDS Working Party, and how their model developed and responded to further data? Because, when one sets up one of these models, it is interesting to see how you can factor in additional data as it comes in, and what time series of data is available.
Mr Edwards (responding): When we started thinking about the Working Party last year, we discussed the precedent set by the AIDS Working Party. We are trying to avoid what happened then, which was a sense of panic, with the curves extrapolated upwards until they ended up being completely non-credible. We have not yet considered how to ensure the model is best positioned to incorporate new data once it is released. That is quite a useful point, so thank you for raising it.
Mr D. B. Pye, F.F.A.: You focussed on UK data but this seems to be a potential global issue. To what extent are other countries and other bodies looking at this, and to what extent is the Working Party looking to have a global view, especially given how people migrate across the globe?
Mr Edwards (responding): There are two aspects to the question. One is the global aspect, and the other is the modelling of other bodies. In regard to the second aspect, we have had quite a few conversations with the Wellcome Trust, and other people within the NHS, who we think would be very interested in model development. Clearly, they are to some extent interested in us developing a model for them. The interesting thing about that was that they seemed to be doing nothing at all about it themselves. Outside the actuarial profession, it is slightly worrying that nobody seemed to be doing any serious modelling themselves. Ms Oliver: We spoke to the Society of Actuaries in the US and I do not think that they, at that time, had set up an organised approach to the issue. This may have changed because that was around about a year ago, but yes, we have reached out.
Mr Edwards: The other consideration is the global aspect. I fully agree this is important. There is an analogy with climate change, whereby if you spend £100 billion, or whatever it might be, decommissioning or converting all your power plants in the UK, but China and India are still spewing out their emissions, it is all a bit pointless. The management of ABR is very similar. If you persuade all the UK GPs to reduce their prescriptions, but the Italians and the French carry on prescribing far too much, and the Indians and the Chinese continue feeding antibiotics to all their livestock, then the individual actions here will be of pretty much no consequence. We are not responsible for global management, and we are rather pessimistic about whether any sort of management framework would work if it is adopted in just one country rather than globally.
The Chairman: I would like to ask a question of Anna (Spender) and Colin (Bullen). This is a rapidly evolving field and our insurance industry could be accused of being notoriously slow at times. What are your thoughts about the risk of us being left behind?
Mr Bullen: We have been slow responding up until this point, particularly from the health and care perspective. That is changing as these types of conversations are starting to happen. A number of insurers within the health and care space have already started to move and started a conversation about what can and should be done. There is an implied threat from the community that already has data, and can do things that we cannot do because we do not have the data. At the moment, that seems to be a relatively mild threat. It did not emerge in our analysis as something of inordinate concern, but it is there. The data aggregators that Anna (Spender) discussed may look beyond their current sphere of interest and motivation into something broader, like the world of insurance. So, there are definitely risks of being left behind in a variety of scenarios, whether by the data aggregators or by competitors within the insurance industry. If you wait for the answer to come to you in this space, you will be left behind.
There is a risk of being irrelevant fairly quickly, but as Anna (Spender) also highlighted, there is also a risk of diving in and being committed, and getting nothing out of that commitment. There are a variety of risks around being left behind. Doing nothing is not an option, but also, there is care to be taken around what you do, and what you do with the information that you collect.
The Chairman: I would like to address a final question to the other working party. There were two themes mentioned in terms of the situation in which we find ourselves. One was a behavioural one, in terms of the overuse of antibiotics. The other one was that the economics of R&D do not add up. On the second one, as we move into a more dire situation, won't the economics correct themselves?
Ms Oliver: I understand that there is quite a lot of movement towards completely changing the business model, in terms of rewarding pharmaceutical companies producing new antibiotics. It will be more of a one-off type of reimbursement, rather than the current model, which is sale by volume, which does not fit with the antibiotic field. That is something that is being discussed at quite senior levels of government both globally and locally and which will have to change.
The Chairman (closing the discussion): I would just like to give some very brief concluding remarks. We started this session with Anna (Spender) and Colin (Bullen) making us feel quite optimistic about the future, with its bewildering range of wearable devices. There are some concerns in terms of the accuracy of the data they are recording, particularly in a real-world setting. So, there will be data considerations that we have to think about: are we capturing the right data, and how can we even cope with the flow of data that we might be expecting? We then came back to earth with the Antibiotics Working Party. We started off with the medical side, and Nicola (Oliver) talking about the behavioural side and the overuse of antibiotics. I found the 65% increase in global antibiotic use between 2000 and 2015 quite shocking. There is some optimism to be had in terms of new compounds, and also possibly CRISPR being used to replace antibiotics.
We also had an advert for next year's sessional meeting which will be in February 2019, where you can see the fully completed model delivered. So, with that, I would like to thank the authors and everybody who contributed to the discussion.