Risk factors for communicable diseases in humanitarian emergencies and disasters: Results from a three-stage expert elicitation

Background: Humanitarian emergencies, including disasters associated with natural hazards, conflict, complex emergencies and famines, can pose significant risks to public health, especially when they lead to population displacement into inadequate conditions. To reduce the risk of communicable disease outbreaks in such situations, it is necessary to know the key risk factors, their thresholds (for quantitative risk factors) and their relative importance in different types of emergencies. Methods: We conducted a three-stage structured expert elicitation. Experts from the fields of health protection and humanitarian assistance were invited to complete three successive online questionnaires. Experts were asked to choose the 20 most critical risk factors and, in subsequent rounds, to determine thresholds for urgent action (yellow threshold level) and critical action (red threshold level). Additionally, experts were asked to assign weights to the risk factors in different emergency types. Results: We identified 20 key risk factors, including factors related to water, sanitation and hygiene, access to health care, vaccination, nutrition, political will and others. Nine of the 20 risk factors were quantifiable; for these, yellow and red thresholds are given. The remaining 11 risk factors were qualitative. All risk factors scored highly when weighted in different emergency types, and differences between risk factor weights across emergency types were limited. Conclusion: Communicable disease risks in humanitarian emergencies are a nexus of complex and often interrelated individual issues. Knowing the key risk factors and their thresholds and weights in different types of emergencies can help guide emergency response and risk reduction efforts.

While the problem of a potentially increased risk of communicable diseases in humanitarian emergencies is well documented, information on specific risk factors and the levels at which these risk factors become critical is lacking. Yet the identification of risk factors and their interaction is crucial for risk management. Knowing the overall risk profiles can help identify those sites where proactive interventions may reduce the impact of communicable diseases. Key risk factors for communicable diseases identified in the academic literature can be broadly grouped into categories such as Water, Sanitation and Hygiene (WASH), health and public health systems, environment, humanitarian response, infrastructure, insecurity, living conditions, nutrition, mass population displacement and economy (23). Within those broader categories, individual risk factors are defined more specifically, although the categories themselves serve as general risk factors as well (1,2,23–33). While similar groups of risk factors have been identified as significant for all emergency types, their weights can differ depending on the individual setting, as does the overall risk of a communicable disease outbreak. For example, as Floret et al. (21) noted, the risk of a communicable disease outbreak is almost negligible in geo-disasters that do not trigger a secondary disaster such as a displacement crisis. For each site, it is also important to know which risk factors are of the most pressing concern, in order to allocate resources correctly and prioritise interventions.
In this paper, we summarise the results from three stages of structured online expert consultations performed to determine the 20 most critical risk factors (across all types of humanitarian emergencies), the thresholds for those factors that could be assessed by a quantitative indicator, and their weights in different types of emergencies. These data were later used in the development of a rapid risk assessment tool to be used by non-experts to assess needs and priorities in humanitarian emergencies. The factors selected as the 20 most critical were included in the tool, and the thresholds and weights for each factor were used as the basis for a per-factor risk score and a combined overall risk score. The risk factors identified, their weights and thresholds, and especially the rapid risk assessment tool are no substitute for a detailed needs assessment; they are designed to rapidly assess communicable disease outbreak risk and, as such, are not a suitable basis for humanitarian programming.

Methods
We conducted a three-stage structured expert elicitation.

Recruitment and participants
Participants who self-identified as having experience in health protection and/or humanitarian assistance were invited to take part. Participants were recruited by email through dedicated listservs that cover areas such as health protection, public health intelligence, humanitarian assistance and disaster studies as well as through the personal and professional contacts of the research team. Participants were then guided to an online questionnaire.
Recruitment included personalised emails to 16 individuals we knew professionally, as well as messages to dedicated relevant listservs. Recipients were encouraged to share the invitation with interested colleagues. Most of the targeted individual recipients had recent field experience supporting responses to humanitarian disasters. Table 1 lists the affiliations of the targeted individuals and the specific listservs; most affiliations were with public health agencies, charitable aid organisations and/or research institutions. Many targeted respondents had multiple relevant affiliations. To help ensure confidentiality, we did not ask during the survey for identifying information such as current employer, job title or years of experience.
Questionnaires are included in the supplemental files. Participants could fill out one or more of the three stages of online questionnaires. Participation in a previous questionnaire was not required to take part in the second and/or third stages. The first questionnaire asked participants to identify the 20 most critical risk factors from a list compiled from the wider literature and a recent literature review by the research team (23). The first questionnaire also asked participants to assign weights (on a scale from 0 to 5) to each risk factor to allow the calculation of a weighted average for each factor. For each factor, the weighted average was calculated by multiplying each importance score (0 to 5) by the number of participants who assigned that score, summing these products and dividing by the total number of respondents. Weighted averages were calculated in case the initial mechanism for selecting the 20 most critical factors, based on how many participants considered them to be in the top 20, proved inconclusive. In the second questionnaire, participants were invited to assign yellow (urgent, action required) and red (critical, action required immediately) thresholds for all quantifiable risk factors.
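As a concrete illustration of the weighted-average calculation (a minimal sketch of one plausible reading of the description above; the function name and example ratings are our own, not taken from the study):

```python
from collections import Counter

def weighted_average(ratings):
    """Weighted average importance for one risk factor.

    `ratings` holds each participant's 0-5 importance score for
    the factor. Each score is multiplied by the number of
    participants who chose it, the products are summed, and the
    sum is divided by the total number of respondents.
    """
    counts = Counter(ratings)
    total = sum(score * n for score, n in counts.items())
    return total / len(ratings)
```

Note that this count-weighted form reduces algebraically to the mean rating; it simply mirrors how the per-score tallies would be recorded in a survey export.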
The third and final questionnaire sought to identify the respective weights (on a scale from 1 to 5) of the 20 most critical risk factors in nine different types of emergencies, as broadly described by Spens and Kovács (34). The types of crisis were: famine (F), complex emergency (CHE), conflict (C), refugee and IDP camp (RC), flooding (FL), geo-disaster (GD), protracted crisis (PC), tropical storm (TC) and tsunami (T). Complex emergencies describe situations in which widespread internal or external conflict has led to a complete breakdown of authority and widespread damage to society; they are defined by requiring a multi-faceted, multi-agency international response (23,35). Conflicts include inter- and intrastate warfare, civil war and insurgency. Geo-disasters include earthquakes, landslides, volcanic eruptions and other disasters caused by geological hazards. Flooding refers to fresh-water flooding. Tropical storms include hurricanes, typhoons, cyclones and similar hydrometeorological hazards. This list of emergency types was not meant to be complete or to comprise mutually exclusive types of crisis. Displacement crises are usually an additional humanitarian emergency secondary to conflicts, complex emergencies, or disasters associated with a natural hazard; however, we believe the risks of communicable disease outbreaks differ significantly enough for these to form distinct categories.

Analysis
Answers were collected online and analysed in Microsoft Excel. Weighted averages, medians and means were calculated where appropriate. Additionally, Pearson correlations were computed in SPSS version 23.
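For readers without SPSS, the Pearson coefficient used here can be reproduced in a few lines of standard Python (a sketch; the function and variable names are illustrative, not part of the study's analysis code):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length
    score vectors, e.g. the weights two risk factors received
    across respondents or emergency types."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)
```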

Responses
The first questionnaire was completed by 21 participants, the second by 24 and the third by 25. To honour participants' right to withdraw consent at any point before completing the survey, we stored, recorded and analysed only completed questionnaires, not those left half-completed. Given that the surveys were advertised widely, this represents a relatively small proportion of possible respondents; however, it is not possible to characterise the actual response rate.

Risk Factors
The first questionnaire sought to identify the 20 most critical risk factors, irrespective of emergency type, and their relative importance. The 20 risk factors chosen by the most respondents (see column 'Selected (n)' in Table 2) were carried forward into the Stage 2 and 3 surveys. Nineteen of these 20 also had the highest overall weighted average scores (see Table 3).
Thresholds
Table 4 shows the expert-identified yellow and red thresholds for the nine quantifiable risk factors. A yellow threshold indicated a situation of concern that should be addressed as soon as possible, while a red threshold indicated a highly critical situation that needed to be a top priority. These thresholds are described individually below.
Access to clean water was measured in litres per person per day. The median red threshold was 2 (mean 5.25, SD 5.01) litres and the median yellow threshold 6.5 (mean 10.5, SD 8.92) litres.
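The two-level threshold scheme translates directly into a simple traffic-light classifier. The sketch below (the function and constant names are our own) uses the median clean-water thresholds just reported; how a reading exactly at a threshold is classified is our assumption, as the elicitation did not specify boundary handling:

```python
# Median expert thresholds for clean water, in litres/person/day
WATER_RED = 2.0     # red: critical, action required immediately
WATER_YELLOW = 6.5  # yellow: urgent, action required soon

def water_risk(litres_per_person_per_day):
    """Classify clean-water availability against the expert
    thresholds; values at or below a threshold (an assumption)
    take that threshold's level."""
    if litres_per_person_per_day <= WATER_RED:
        return "red"
    if litres_per_person_per_day <= WATER_YELLOW:
        return "yellow"
    return "green"
```

The same pattern applies to each of the other quantifiable risk factors, with "at or below" inverted for indicators where higher readings mean higher risk.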
The available number of hospital beds per 10,000 persons was used as a proxy indicator for the risk factor health care facilities. The median red threshold was 5 beds (mean 18.77, SD 27.28) per 10,000 persons and the median yellow threshold was 20 beds (mean 45, SD 54.70) per 10,000 persons.
The median red threshold for functioning toilets was 4 (mean 4.92, SD 4.95) toilets per 100 persons and the median yellow threshold was 9 (mean 10.86, SD 11.74) toilets per 100 persons.
The number of health professionals per 10,000 persons was measured in three categories. The median red threshold for doctors was 1.5 (mean 19.21, SD 35.24) per 10,000 persons and the median yellow threshold was 5 (mean 27.31, SD 55.91) doctors per 10,000 persons. The median red threshold for nurses was 6 (mean 96.79, SD 256.24) per 10,000 persons and the median yellow threshold 10 (mean 63, SD 111.29) nurses per 10,000 persons. The median red threshold for community health care workers was 8.5 (mean 15.86, SD 26.18) per 10,000 persons and the median yellow threshold was 20 (mean 42.46, SD 55.51) community health care workers per 10,000 persons.
Vaccination coverage was measured for the following four diseases: measles, meningococcal meningitis, polio and hepatitis B. The median red threshold for measles vaccination coverage was 75% (mean 67.21, SD 23.46) and the median yellow threshold was 90% (mean 81.92, SD 14.88). The median red threshold for meningococcal meningitis vaccination coverage was 72.5% (mean 62.21, SD 23.92) with a median yellow threshold of 80% (mean 73.08, SD 21.53). The median red threshold for polio vaccination coverage was 75% (mean 64.31, SD 25.89) with a median yellow threshold of 87.5% (mean 83.33, SD 12.80). The median red threshold for hepatitis B vaccination coverage was 50% (mean 52.00, SD 23.90) with a median yellow threshold of 72.5% (mean 70.83, SD 17.42).
Hammer C, Brainard J, Hunter P. Risk factors for communicable diseases in humanitarian emergencies and disasters: Results from a three-stage expert elicitation. Global Biosecurity, 2019; 1 (1).
These figures, especially the seemingly 'high' figure for the yellow threshold, must be understood in the context of the impact of mal- and undernutrition on the severity of communicable disease outbreaks, through mechanisms such as increased susceptibility and greater shedding and transmission. Poor nutritional status is a common attribute of affected populations in many humanitarian emergencies and is known to exacerbate the size and severity of communicable disease outbreaks (1,24,36–38).
The median red threshold for the distance between human waste disposal and housing was 20 metres (mean 71.00, SD 138.53) and the median yellow threshold was 50 metres (mean 79, SD 89.60).

Table 3. Weighted averages of the importance of the risk factors in humanitarian emergencies and disasters, irrespective of emergency type and setting. 0 = not selected/not important; 1 = a little important; 2 = important; 3 = quite important; 4 = very important; 5 = extremely important. Green indicates the factors included in stages 2 and 3, while the factors marked in red were discarded after stage 1.

Weights in different emergency types
Weights for the different risk factors were similar across emergency types, with only minor differences (see Figure 1 and Tables 5 and 6). On a scale from 1 (not important) to 5 (very important), all included risk factors scored above 4 (both mean and median) when combining all emergencies. The only two risk factors with a median of 3 were 'insufficient nutrient intake' and 'lack of health education' in the context of a tropical storm. Mean values for all risk factors in all individual emergency types remained above 3.4, except for 'lack of health education' in the context of flooding (mean 3.29, SD 1.14, median 4) and in the context of a tropical storm (mean 3.22, SD 1.28, median 3). This reinforces the importance of these risk factors across different types of humanitarian emergency.
There was considerable correlation between risk factors, demonstrating the highly interactive nature of risk and risk factors in humanitarian emergencies as well as the complexity of such situations (see Table 7).

First, the Sphere standards are neither intended as a risk assessment nor specific to communicable diseases. Second, the Sphere standards have a normative component: they indicate standards that should be reached based on ethical considerations, rather than standards that empirically relate to changes in the level of risk experienced. While this makes the Sphere standards an unsuitable comparison, it is interesting to see how this difference in approach shapes the suggested thresholds. The Sphere standards indicate a minimum of 15 litres of water per person per day (49). Our survey found a yellow threshold for clean water availability at 6.5 litres per person per day. This difference is explained by the fact that the thresholds we sought to identify mark only increases in disease outbreak risk. A yellow threshold for clean water at 6.5 litres per person per day does not suggest that a person needs no more than 6.5 litres of water per day, but rather that below that level the risk of a communicable disease outbreak critically increases. Additionally, some of the risk factors, and especially their measurements, are simply proxies. This becomes clear when looking at vaccination coverage: the selected vaccines are not meant to be the main, the only, or even necessarily a vaccination priority in all emergencies; rather, they are used as proxies to estimate the reach of vaccination programmes. Keeping this in mind, the measures and risk factors identified are entirely unsuitable as a basis for humanitarian programming.
Humanitarian programming should instead follow a suitable method for needs assessment (which a communicable disease outbreak risk assessment, the purpose the factors suggested here serve, plainly is not) and an estimation of minimum standards based on internationally accepted levels such as the Sphere standards.
In contrast, the thresholds identified by our surveys indicate precise and transferable tipping points for levels of risk. They are a first step towards developing a rapid risk assessment mechanism for communicable disease outbreaks in humanitarian emergencies that judges a situation against pre-defined thresholds and risk levels, rather than asking the person completing the assessment for a qualitative, personal judgement of severity without any indicators on which to base it. Hence our thresholds should prove useful in real-world risk assessment, because they identify specific risk thresholds using simple quantitative indicators.

Limitations
While we made every attempt to maximise participation, the main limitation of this work is the small number of respondents. However, it can be argued that the pool of experts suitable for participation is not large. Our experts' opinions are in line with assessments in the scientific literature of the relative importance of different risk factors. Expert elicitations have their limits and are subject to biases (50,51), and overconfidence in their results should be avoided (51). Hence, we do not recommend accepting the results without further inquiry, even though they are mostly in line with the literature.
Additionally, the above-mentioned lack of specification, and the possibly blurred and broad definitions of some of the risk factors, is a potential limitation. This would certainly be the case if the results of this research were used uncritically to make decisions in the field, even if only for risk assessment without further investigation. However, considering that we do not recommend using these results beyond the realm of risk assessment, and that for risk assessment we considered this research a first stage within a larger research project, the results form a good starting point for understanding expert opinion on some of the most critical risk factors for communicable disease outbreaks in humanitarian emergencies.

Conclusion
Communicable disease outbreaks remain a significant concern in the aftermath of emergencies and disasters, especially in low- and middle-income countries. Broadly, the expert consensus seems to be that WASH, access to healthcare, nutrition and wider societal and emergency-specific factors are among the most important indicators and risk factors for communicable disease outbreaks in such situations. These factors remain important across different types of humanitarian emergency. Beyond establishing current expert opinion, this research also serves as a starting point to assess and improve risk assessment tools, methods and protocols for communicable disease risks in humanitarian emergencies and disasters. Current risk assessment tools, such as the WHO tool used in the context of the EWARN system (52,53), also use individual risk factors. However, there is a strong need to make risk assessments clearer and more explicit by using, where possible, previously determined risk factor thresholds that can be assessed without expert knowledge in each domain. Ideally, this risk summary would be based on an independent needs assessment and require minimal additional primary data collection in the field. The expert consultation described in this article, combined with a systematic review performed in parallel (23) and additional research by the research team, seeks to be the basis for such a pragmatic, easy-to-use and novel risk assessment tool. No system captures the complexity and diversity of humanitarian emergency settings perfectly, and even accepted international standards such as Sphere are under constant revision and do not cover all aspects of humanitarian response.
However, such a risk assessment tool can be seen as an attempt to capture some of the main risk factors for communicable disease outbreaks in such settings, especially since, unlike the WHO's risk assessment tool for communicable diseases in humanitarian emergencies (52,53), it does not assume considerable expert knowledge on the part of the person or persons using it.

Introduction
Recently, the healthcare industry has been facing a new type of hazard: bad actors have started targeting hospitals and other healthcare facilities with cyberattacks. The industry is particularly vulnerable to cyberattacks because healthcare providers depend on up-to-date electronic health data. This information includes patient histories and test results, which are often needed at a moment's notice to provide critical patient care. Approximately 95% of hospitals in the United States use health information technology, such as electronic medical records (1). Many other health technologies, including glucose meters, IV pumps, and implanted medical devices, are also connected to and dependent on the hospital's network. With patient safety on the line, hospitals may be more willing to pay to restore access to their network. Healthcare organizations (HCOs) have become much more reliant on health information technology over the past decade. Another vulnerability that makes hospitals susceptible to cyberattacks is the combination of out-of-date cybersecurity systems at many facilities and limited staff training in safe cyber practices (2). These characteristics combined make HCOs good targets for attack (1,3).
The cyberthreats that HCOs now face are complex and can originate both inside and outside the network (4). In a survey of healthcare organizations conducted by the Healthcare Information and Management Systems Society (HIMSS), 37.6% of respondents said their most recent security incident was caused by an online scam artist, whereas 20.8% reported a negligent insider and 20.1% reported a hacker as the cause (5). There are also many points of entry into a healthcare network, which can make it extremely vulnerable (see Figures 1 and 2). A point of entry is a way for bad actors to gain access to a hospital computer or network in order to achieve something malicious, whether stealing data or delivering a virus payload (6). Points of entry identified in the HIMSS Cybersecurity Survey include email, infected hardware or software, compromised medical devices, third-party websites, and a provider or service linked to the network via the cloud (5). Additional points of entry include internet access, a wireless network, removable media (e.g. a USB drive or laptop), or theft of equipment (6). In the 2018 HIMSS Survey, 61.9% of participants identified email (e.g. a phishing email) as the point of entry in their organization's most recent significant security event. Hackers also attack through backdoors or unpatched vulnerabilities, which are essentially access points left open across the network. Figure 1 displays a sample hardware network of an HCO. Each switch on the diagram represents multiple devices connected to the network, and each device presents its own points of entry via email, the internet, or USB connections. Depending on the level of network cybersecurity, an infected phone connected to a system computer, or an infected link clicked in an email, can transfer a virus to the network, where it can spread. Figure 2 shows an example of a software network within an HCO.
In this example, there is a virtual interface with a corporate office that has its own clinical and administrative management software. There are also interfaces with many different applications used around the organization, including imaging, labs, pharmacy, payroll, and patient scheduling. Each application represents a potential point of entry for bad actors to break into the organization. HCOs must rely on their corporate interfaces as well as third-party vendors to keep their products secure with up-to-date protections. With so many different points of entry into the HCO hardware network, these networks have become extremely intricate and therefore highly susceptible to unauthorized access. This complexity also makes the networks hard to secure. Figures 1 and 2 are based on a small hospital network, but the connectivity displayed in each diagram, a central hub that interacts with many different devices and applications, is the set-up seen in the typical U.S. hospital.
Hackers use different attack techniques to take advantage of HCO vulnerabilities and gain access to the network. A common type of attack is a phishing scam conducted over email. Hackers send an authentic-looking email to hospital staff that includes a link or attachment which unsuspecting users open or click. Once that content is activated, the hacker gains access to the network and can extract information or activate a malicious virus (6). Phishing scams are on the rise; there was a 789% increase in phishing emails from the last quarter of 2015 to the first quarter of 2016 (7). A second type of attack is a malware attack, in which malicious code or a virus is dispatched within a computer network (4). One example of a malware attack of growing concern for healthcare organizations is ransomware. In the 2018 HIMSS Cybersecurity Survey, respondents ranked perceived threats: ransomware is now second on the list (11.3%), whereas natural hazards (i.e. fire or flood) were eleventh (8.3%) (5).
During a ransomware attack, bad actors lock users out of a network and demand a ransom payment to restore access. The first ransomware attack took place in 1989, when an AIDS researcher, Joseph Popp, sent 20,000 floppy disks to AIDS researchers in 90 countries. The floppy disks were said to contain a questionnaire to help determine a patient's risk of contracting AIDS. When inserted, the disks infected the computer with a virus that lay dormant until the computer was booted for the 90th time. At that point, a note would appear on the screen demanding licensing fees while locking the user out of the computer (3). Since 1989, ransomware attacks have continued and are now categorized as one of two types: scareware and crypto ransomware. Scareware informs a computer user that there is something fatally wrong with their machine and offers a solution for a small payment. Crypto ransomware is much more complex, in that it encrypts computer files so that they need a specific decryption key to be opened. These crypto-viruses have become much harder, and often impossible, to break, even for experts (3).
As with the first ransomware attack, hackers have again shifted their targets to the healthcare industry. In healthcare, this type of attack can essentially shut down an organization's ability to operate and lock providers out of essential data needed to provide patient care (8). In May 2017, a global ransomware attack known as WannaCry was perpetrated by the North Korean government (9). Hackers utilized a stolen National Security Agency (NSA) tool that exploited a vulnerability in Windows operating systems to gain access to 300,000 computers across 150 countries (9,10). During this attack, 36 health organizations in Great Britain, including hospitals, ambulance services, and physicians' offices, were locked out of their systems (11). WannaCry forced the National Health Service to divert patients from certain facilities in order for them to receive the care they needed (11). Homeland Security experts have said this attack directly put patients' lives at risk (10).
This type of cyberattack against organizations has become more frequent (12). In April 2016, ransomware attacks jumped 159% from the month before, a huge rise over the 9-20% monthly increases previously seen (13). In 2015, across all industries, the Federal Bureau of Investigation (FBI) reportedly received more than 2,500 ransomware complaints, which cost the victims $214 million (14). A 2016 IT report stated that 93% of phishing emails now contained ransomware (7). In 2018, the city of Atlanta fell victim to a ransomware attack and lost many of its critical municipal systems. This attack alone cost the city $2.7 million to recover from (15).
In February 2016, an outbreak of ransomware attacks against United States hospitals began at Hollywood Presbyterian Medical Center in Los Angeles, California. The hospital was offline for over a week before deciding to pay the ransom (16). Approximately $17,000 was paid, and the hospital regained access to its operating systems (17). Since this initial attack, there has been a surge in reported malware attacks on healthcare providers across the United States. These attacks can be extremely costly for HCOs (18). A hospital in New York was attacked in 2017, and its recovery cost has been estimated at almost $10 million, including hardware, software, extra staff hours, overtime hours, and loss of business (19). The ongoing fixes and upgrades to the hospital system are estimated at an additional $250,000 to $450,000 a month (19). In the most recent HIMSS Cybersecurity Survey, 75.7% of respondents reported a significant security incident in the past 12 months (5).
The best way for hospitals to protect themselves is to be proactive and take steps to strengthen their potential vulnerabilities and weaknesses. Hospitals need to conduct risk assessments to better understand how large the risk malware attacks pose to their organization, as well as how big an impact successful attacks can have on operations. Once they have a risk analysis of malware attacks, HCOs can decide which fixes to their system make the most sense financially to offer the most protection.
The lack of reliable reporting on the frequency and impact of this type of attack makes it difficult for the healthcare industry to better secure its systems. The risk reports that do exist do not expand on the nature and scope of successful attacks. Some incidents affect only a few computer terminals, whereas others have a more significant impact on the organization and have the potential to affect patient care and safety. Because hospitals depend on immediate access to clinical data, and because Hollywood Presbyterian Medical Center paid the initial ransom, these incidents are only expected to grow in frequency.
Currently, there are popular media reports on these attacks, but there is no methodology for consistently tracking hospital attacks over time. This study seeks to address this gap by assessing the trend of malware attacks on HCOs over time. This objective will be achieved by reviewing publicly reported, successful attacks on healthcare organizations within the United States between 2016 and 2017. The final product of this analysis will be a timeline of reported ransomware attacks on hospitals, as well as a summary of what data is reported with each attack. A logic diagram will also be developed to show the process of a malware attack on an HCO. Without a better understanding of this type of threat, healthcare organizations cannot adequately protect their organization or their patients' safety (4).

Methods
A content analysis was conducted of news articles related to hospital malware attacks. The news sites Healthcare IT News and Becker's Hospital Review were used as data sources. Healthcare IT News, published by the Healthcare Information and Management Systems Society (HIMSS), is one of the most comprehensive news sources for information on healthcare information technology. Becker's Hospital Review is another well-known and reputable source on information technology in healthcare. A search of these sources was conducted using combinations of the keywords "hospital" or "healthcare", "malware" or "ransomware", and "attack". Retrieved articles were reviewed for relevance to the research question. The inclusion criterion was a reference to a malware or ransomware attack on a hospital or healthcare facility within the United States during 2016 or 2017. Articles that discussed data breaches caused by hackers or misplaced hardware, as well as articles that discussed phishing scams, were excluded from this analysis.
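The keyword combinations described above can be sketched as a simple relevance filter. The following is a minimal illustration, not the study's actual tooling; the headlines are invented examples used only to exercise the filter:

```python
from itertools import product

# Keyword groups from the search strategy: each candidate query pairs one
# term from each group, e.g. "hospital" + "malware" + "attack".
GROUPS = [["hospital", "healthcare"], ["malware", "ransomware"], ["attack"]]

def matches_search(text: str) -> bool:
    """True if the text contains every keyword of at least one combination."""
    t = text.lower()
    return any(all(kw in t for kw in combo) for combo in product(*GROUPS))

# Hypothetical headlines for illustration only.
headlines = [
    "Ransomware attack forces hospital offline for a week",
    "Healthcare data breach traced to misplaced laptop",  # excluded topic
]
relevant = [h for h in headlines if matches_search(h)]
```

In practice the news-site search engines applied these keyword combinations, after which articles were screened by hand against the inclusion and exclusion criteria.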
The included articles were analyzed to identify cases, which were then formatted into timelines summarizing the number and locations of reported malware attacks. Upon further investigation, each case was also reviewed for date of attack, name of facility or organization, location, number of facilities affected, impact on the facility, and any disclosed outcome. If an article referenced a data breach, that information was cross-referenced with the U.S. Department of Health and Human Services Office for Civil Rights Breach Report Database; the HITECH Act requires that all data breaches impacting 500 or more individuals be reported in this database. These data were put into a table to summarize the extent of publicly reported malware attacks on United States hospitals between 2016 and 2017, and to identify trends within the dataset.
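The extraction fields listed above can be sketched as a small record structure from which the summary counts follow mechanically. The two records below are invented placeholders, not data from the study:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Case:
    """Fields extracted for each identified malware attack."""
    date: str                 # date of attack, "YYYY-MM"
    organization: str
    state: str
    facilities_affected: int
    impact: str
    outcome: str              # e.g. ransom paid, downtime, records breached

# Hypothetical records for illustration only.
cases = [
    Case("2016-02", "Example Hospital A", "CA", 1, "systems offline", "ransom paid"),
    Case("2017-06", "Example Health System B", "NY", 3, "EHR downtime", "undisclosed"),
]

# Summary tables of the kind described in the text.
attacks_per_year = Counter(c.date[:4] for c in cases)
attacks_per_state = Counter(c.state for c in cases)
```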
A logic diagram was also created to illustrate a malware attack on a hospital network through a phishing attempt. This diagram walks through the steps of a phishing ransomware attack in which a hacker gains access to the network. The logic diagram was created using data collected during qualitative interviews with subject matter experts, including a Chief Information Officer, a Chief Information Security Officer, a Senior Network Administrator, and a Healthcare IT Manager. It uses a hypothetical hospital to show the extent of a successful phishing attack and the breadth of access to data and applications a hacker could potentially gain within a secure network.

Figure 1. Hardware Network Diagram
Note: Below are brief explanations of the purpose of each hardware device in this figure. A server is a computer that either provides information to other computers or stores files which can be accessed from other computers. A router directs communication traffic between devices (e.g. computers). A firewall is a form of security used to keep unauthorized users out of a network. A mainframe is a computer on which large organizations store their critical applications, accessed through the network. A switch is a networking device that connects multiple computers to the network. The internet connection is the organization's connection to outside networks.

Malware Attacks, United States 2016-2017
Overall, this study identified 49 reported cases of malware attacks on U.S. healthcare organizations during 2016 and 2017: 22 attacks in 2016 and 27 in 2017. Figures 3 and 4 present these cases, respectively. The analysis shows that attacks occur all over the country and take place all year long. There were malware attacks on HCOs in 13 states in 2016 and 20 states in 2017; a map of the United States displaying the frequency of malware attacks for both years is shown in Figure 5. The state with the most attacks was California, with 9 across both years, while 16 states saw one attack across both years. Both years had attacks reported in 9 different months. The attacks affect more than just hospitals: one attack against a health system impacted 10 hospitals and 250 outpatient clinics in the D.C./Maryland region, and another impacted a health system's hospitals across state lines. Some attacks impacted only one facility, but often that facility lost access to its medical records.
Not all of the 49 identified cases had the same impact on their respective healthcare organization. Tables 1 through 4 present impact details of the identified malware attacks. Forty-one of the cases were labeled as ransomware attacks (shown in Table 1). The articles reported that at least six organizations paid a ransom (shown in Table 2). In one case (Kansas Heart Hospital), the hospital paid the ransom but the hackers released only a portion of the files before demanding a second ransom, which the hospital did not pay (20). The other organizations either did not pay or did not disclose a payment to the press. Some articles reported outage times, which ranged from 1 day to about 2 weeks (shown in Table 3); the most frequently reported time offline was one week. Hollywood Presbyterian, the target of the first ransomware attack against a hospital, paid $17,000 after a standoff with hackers and almost two weeks offline. Another major impact identified was compromised patient or staff records: sixteen attacks reported no records breached, seventeen reported fewer than 50,000 records impacted, and the highest reported figure was 500,000 breached records, with three other attacks reporting more than 200,000 (shown in Table 4).
One of the issues identified while completing this content analysis was the lack of consistency in reporting and defining this type of attack. Different search terms were required to identify different cases. Table 5 shows the terms required to find each case: ten cases appeared only in searches using the term "cyberattack", eight only under "malware", and ten only under "ransomware". The remaining 21 cases were identifiable using more than one of the listed search terms. This inconsistent terminology makes it difficult to fully identify all reported cases.
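The overlap figures above are internally consistent with the 49-case total; a quick arithmetic check:

```python
# Cases found by exactly one search term (counts reported in Table 5).
only_cyberattack = 10
only_malware = 8
only_ransomware = 10
multiple_terms = 21  # cases found under more than one term

# The four groups are disjoint, so they should sum to the 49 identified cases.
total_cases = only_cyberattack + only_malware + only_ransomware + multiple_terms
```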

Logic diagram
Due to the complexity of healthcare organizations, there are a few steps hackers must go through to gain access. Figure 6 presents the steps as they would occur in an email phishing attack. The attack begins when a hacker sends mass emails to employees within an organization, attempting to deceive at least one employee. The email contains either a malicious link or an attachment that allows the hacker to harvest an employee's credentials to the organization. With these stolen credentials the hacker can impersonate the employee within the system and, depending on the employee's level of access, gain direct access to network applications or find another user credential with a higher level of access.
Once the hacker gains administrator-level access, they can move across the organization's network to find the information they are looking for. In this scenario, Figure 6 shows the applications and confidential data the hacker would gain access to in this HCO. The software applications include timekeeping, imaging, medical scribing, catheter laboratory services, obstetrics and gynecology clinical services, the network email exchange, and all organizational file shares. From this position the hacker has access to protected health information, proprietary business data, payroll information, and other confidential data, such as the social security numbers of patients and staff members. If the hacker's goal is to deliver a malicious payload, such as ransomware, they can choose where to drop it once they have access to these applications, picking the location that would cause the biggest service disruption and so increase the likelihood that the organization will pay the ransom demand.
Once a hacker gains access to the HCO's network, the HCO itself has limited options for stopping that access. The first step is that the HCO must realize someone with malicious intent is inside its network; in ransomware attacks, this often does not happen until applications stop working or a ransom note appears on desktops across the organization. In such cases it is imperative that the HCO shut down everything on the network to stop the spread of the malware and to cut off the hacker's access. This step also cuts off all users' access to the network and causes a complete organization-wide downtime. Once the network is shut down, the HCO can conduct impact assessments to see how much damage has been done, if any, and can begin its recovery and business continuity processes. If the HCO decides not to shut down the network, the hacker retains access and the malware can continue to spread, infecting more machines.
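The attack progression just described can be sketched as an ordered sequence of stages. This is an illustrative abstraction of the logic diagram, not code from the study; the stage names are invented labels:

```python
# Ordered stages of the phishing ransomware attack described above.
ATTACK_STAGES = [
    "phishing_email_sent",
    "employee_clicks_link_or_attachment",
    "credentials_harvested",
    "privilege_escalation_to_admin",
    "lateral_movement_across_network",
    "payload_dropped",          # e.g. ransomware at a high-impact location
    "ransom_note_displayed",
]

def stages_remaining(detected_at: str) -> list:
    """Stages the attacker has yet to reach when the HCO detects the
    intrusion; as noted above, detection often only happens at the
    ransom note, by which point no stages remain to interrupt."""
    i = ATTACK_STAGES.index(detected_at)
    return ATTACK_STAGES[i + 1:]
```

The sketch makes the defensive point concrete: the earlier in the sequence the intrusion is detected, the more stages remain at which a network shutdown can still limit damage.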

Discussion
Over the last few years, there has been an increase in cyberattacks targeting healthcare organizations. This content analysis found 49 instances of malware attacks on U.S. healthcare organizations during 2016 and 2017. These attacks occurred all over the country, with 27 states reporting an attack during this period. The attacks also impact all areas of healthcare delivery, including hospitals, primary care, outpatient clinics, medical suppliers, and electronic medical record providers.
With aspects of care delivery at risk, malware attacks are a threat to patient safety (6). The 49 attacks identified through this analysis had varying levels of impact, but all required the organization to go offline for a period of time to stop the spread of the malware. Providing care without access to patient history can be hazardous: for example, without the system's automated checks in place during medication prescribing, there is a chance that something in the patient chart is overlooked. Medical devices are also at risk during malware attacks, including therapeutic equipment (infusion pumps), life-support equipment (ventilators), and diagnostic equipment (PET scanners). Any of these devices can serve as a backdoor into healthcare networks if not secured. One report reviewed three case studies in which medical devices were used by hackers to break into and move through a network (21).
Malware attacks can also affect patients and staff in ways other than through the provision of healthcare services. Attacks can have direct impacts on the facility itself, with potential downstream impacts on patient care. At least one of the attacks in this analysis affected security systems: the hospital's security cameras went offline and it was forced into lockdown until the cameras could be restored. Another system potentially at risk is the HVAC system; without environmental temperature regulation, patients may need to be evacuated. Finally, as seen in other cyberattacks, the electrical grid and water treatment are also potential targets (22). Without power or clean water, hospitals could no longer provide care and would also be required to move patients. Evacuating a hospital is an extreme undertaking in terms of staffing and resource needs, as well as finding equivalent bed capacity for patients. An extreme example of the impact of power loss and evacuation on patient care was seen during Hurricane Katrina at Memorial Hospital, where physicians decided which patients to save and hastened the death of others (23).

This is the first known content analysis to develop a list of malware attacks across the healthcare industry. One limitation of this research is its reliance on public reports of attacks. Not all attacks are reported, and most reported attacks are large-scale incidents; based on FBI and HIMSS data, we know that this is a much bigger problem. The FBI urges HCOs to report attacks, but ultimately this is left to the discretion of the facility, and attacks are only required to be reported when medical or financial information has been compromised. One reason for not reporting is that HCOs do not want to risk their reputation or income by being labeled a victim. This reporting loophole makes it much harder for the industry to get a clear picture of the attack trend (24).
Another limitation is the lack of consistency in reports of each attack. This study tried to mitigate this inconsistency by using multiple search terms, including 'malware', 'ransomware', and 'cyberattack'. With different terminology used in reports, there are potentially cases that were reported but not captured by the content analysis. Even with this limitation, the picture provided by this content analysis illustrates the frequency and types of cyberattacks, which had not previously been researched. The sample includes only successful attacks, but many more institutions are vulnerable to attack (5). There is a need for the healthcare industry to push for more public data regarding this hazard. If attacks were reported to a single database, this information could be accessed in one location and used to better educate healthcare administrators on the risk that cyberattacks pose to healthcare delivery and business continuity. It could also be used to develop a more accurate hazard vulnerability assessment (HVA) for HCOs; a well-informed HVA is the basis for effective preparedness and response planning within emergency management.
In 2018, this trend against the healthcare industry continued to grow: as of September 2018, malware attacks had been reported every month of the year, affecting health systems, hospitals, third-party medical suppliers, hospice care, provider clinics, and medical device manufacturers. Healthcare organizations can take several recommended actions to protect their networks, including developing a security culture within the organization; it is recommended that HCOs teach safe-use habits to all staff and test them on these rules. There are also IT solutions to protect against cyberattacks, such as strong firewalls, antivirus software, intrusion detection, and limiting network access (21). Another avenue HCOs can explore in preparing for cyber threats is procuring cyber insurance. The costs of attacks are estimated to reach the trillions worldwide by 2020 (25). Cyber insurance is a way to protect the HCO enterprise: insurance companies conduct a full assessment of an organization's IT capabilities and offer differing levels of coverage for a price, although insurance often does not cover loss of revenue from downtime during attacks (25). As this type of threat continues to evolve, so too will cyber insurance policies.
Cyber threats to our society are only expected to grow over time. A 2017 article from the American Public Health Association cited a cyber-firm report estimating that over the next five years cyberattacks would cost the United States healthcare system $305 billion in revenue and affect 1 in 13 patients (26). Due to the relatively low number of cases identified in this content analysis, a follow-up systematic review on this topic would be appropriate to compare reporting trends for these events. There is also a need for future research to better define what happens within an HCO during an attack; further review of attack cases could highlight lessons learned and potentially identify best practices. This research will help HCOs better understand this hazard in order to prepare for and mitigate this threat. The healthcare industry has a choice to make when it comes to emergency preparedness: will it prepare organizations to prevent threats and protect patient health, or rely on the recovery offered by cyber insurance?

Introduction
Though smallpox (causative agent: variola virus (VARV)) was eradicated in a global triumph in 1980, it remains a threat as a category A bioterrorism agent. Two known caches of VARV still exist, in the United States (US) and Russia, but more stockpiles of the virus could exist elsewhere (1). Advances in synthetic biology have led to increasing concern that smallpox (possibly antiviral-resistant) could be synthesised from scratch (2). Given that routine smallpox vaccination ceased in the 1970s, most of the world's population is immunologically naïve or has waning levels of protection (3). While the first-line response to an outbreak would be vaccination, up to 25% of individuals, such as the immunocompromised, are contraindicated for vaccination. Another available countermeasure is vaccinia immune globulin (VIG); however, it can only be produced from the purified blood products of vaccinees and is hence in short supply (4). It is therefore imperative to develop other countermeasures that can be used to manage an outbreak of smallpox or other orthopoxviruses (OPXV), as well as smallpox vaccination adverse events (AEs).
The most viable antivirals available for treatment of OPXV infections are cidofovir (CDV), brincidofovir (BCV) and tecovirimat. CDV (HPMPC; Vistide) is a nucleoside analogue with antiviral activity against dsDNA viruses and is currently approved for treating cytomegalovirus (CMV) retinitis in AIDS patients (5). In a smallpox emergency, CDV could be made available by the US Food and Drug Administration (FDA) as an Investigational New Drug (IND) or under an Emergency Use Authorization (EUA) (6). Its mechanism of action is to block the viral DNA polymerase, preventing viral replication. BCV (HDP-CDV; CMX001) is an alkoxyalkyl derivative of CDV with high oral bioavailability: it structurally resembles natural lipids, so the compound is more readily absorbed through the small intestine (7,8). As BCV is metabolised intracellularly, concentrations are reduced in the kidney, the site of dose-limiting toxicity (9)(10)(11). BCV has not yet been approved for clinical treatment, except for compassionate use, which allows unapproved drugs to be used for a seriously ill patient when no other treatment options are available; BCV has been used in CMV patients who have undergone allogeneic haematopoietic cell transplantation (HCT) (12). BCV did, however, receive Fast Track status and Orphan Drug Designation from the FDA in June 2018 for the treatment of smallpox (13). Tecovirimat (ST-246; TPOXX) is a low-molecular-weight compound that is a potent and specific inhibitor of orthopoxvirus replication (14). In July 2018, tecovirimat was approved as the first treatment for smallpox under the FDA's Animal Rule (15).
To date, no systematic review has been completed on the potential efficacy of CDV, BCV and tecovirimat against smallpox. To address this gap, this systematic review aimed to evaluate the existing research on antiviral efficacy against smallpox and other OPXV, providing a holistic understanding of their effects in vitro, in in vivo animal studies, in human safety trials and in reported human cases of OPXV infection.

Objectives
Four specific objectives, one for each arm of the systematic review, were defined and are detailed below (Table 1).

Search strategy
A systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Studies were identified by searching the electronic literature databases MEDLINE (1946 to July 31, 2018) and EMBASE (1974 to August 1, 2018) and by hand-searching the reference lists of articles and reviews. The last search was run on 24 September 2018.
Initially, two reviewers (JY and SMR) conducted independent searches to reduce the bias of a single person conducting the search. The two reviewers then discussed and finalised an agreed search strategy, which allowed keywords missed by one reviewer to be included. Two searches were conducted using a combination of Medical Subject Headings (MeSH) and text words (Appendix A). The first search (MEDLINE + EMBASE) aimed to identify in vitro studies, in vivo animal studies and human safety trials, using the following keywords and their synonyms: orthopoxvirus, cowpox, ectromelia, monkeypox, smallpox, vaccinia, cidofovir, brincidofovir, tecovirimat. Only MEDLINE was used for the second search, as EMBASE does not have a case report filter. This search aimed to identify cases of human OPXV infection in which antivirals were used; results were limited to case reports, and only studies published after 1980 (the year of eradication) were included.
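The combined keyword search can be sketched as a boolean query string of the kind used with MEDLINE and EMBASE. The syntax below is illustrative only; the review's actual search strings, including MeSH headings and synonyms, are given in Appendix A:

```python
# Keyword groups from the first search strategy.
viruses = ["orthopoxvirus", "cowpox", "ectromelia", "monkeypox",
           "smallpox", "vaccinia"]
drugs = ["cidofovir", "brincidofovir", "tecovirimat"]

def or_block(terms):
    """Join terms into a parenthesised OR clause."""
    return "(" + " OR ".join(terms) + ")"

# Any virus term AND any drug term must both appear.
query = or_block(viruses) + " AND " + or_block(drugs)
```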

Study selection and data extraction
We sought studies that presented quantitative data on the efficacy of antivirals against OPXV. After reviewing the results of exploratory searches, it was decided that studies should be divided into four arms with separate eligibility criteria, summarised in Table 2. The types of studies included were: (1) in vitro studies, (2) in vivo animal model studies, (3) human clinical trials and (4) human case reports. Each search strategy is shown as an individual flow diagram, according to PRISMA (Figure 1).
Studies were screened for relevance by title and abstract. Two reviewers (JY and SMR) independently applied the inclusion criteria to all identified and retrieved articles, and one reviewer (JY) extracted data from the included studies. Case reports, conference reports, reviews and letters to the editor were excluded. Selected papers were limited to the English language; articles in non-English languages with English abstracts were excluded. Disagreements were resolved by consensus.
For in vitro studies, the chosen outcome measure was the effective concentration of antiviral capable of inhibiting 50% of the cytopathic effect (EC50), or the 50% inhibitory concentration (IC50). These measures are used to evaluate and compare different antivirals and to identify their potential clinical effectiveness in humans; they were also common to the identified studies and allowed comparison between them. Studies with antiviral efficacy as the main objective were identified as key papers; studies that had original data but used CDV, BCV or tecovirimat only as reference values were included but noted as supplementary. For in vivo studies, the outcome measured was the impact of antivirals on mortality. Only lethal OPXV challenges were included, and results were grouped by animal model and inoculation route. For human safety or efficacy trials, any drug-related AEs were recorded. Finally, for human case studies, the antivirals were broadened to include VIG; all cases reporting the use of these antivirals in any OPXV infection were recorded and the impact on disease progression noted.
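As a rough illustration of how an EC50 is read off a dose-response curve, the following sketch interpolates on a log-concentration axis between the two measured concentrations that bracket 50% inhibition. This is a generic technique, not the studies' own analysis method, and the data points are invented:

```python
import math

def ec50(concs, inhibition):
    """Estimate the concentration giving 50% inhibition.
    concs: ascending drug concentrations (e.g. in uM);
    inhibition: fraction of cytopathic effect inhibited at each concentration."""
    points = list(zip(concs, inhibition))
    for (c1, y1), (c2, y2) in zip(points, points[1:]):
        if y1 <= 0.5 <= y2:
            # Interpolate on log-concentration, since dose-response curves
            # are roughly sigmoidal on a log axis.
            frac = (0.5 - y1) / (y2 - y1)
            return math.exp(math.log(c1) + frac * (math.log(c2) - math.log(c1)))
    raise ValueError("50% inhibition is not bracketed by the data")

# Invented dose-response data for illustration: 50% inhibition at 10 units.
estimate = ec50([1, 10, 100], [0.2, 0.5, 0.9])
```

In practice the included studies report EC50/IC50 values fitted from full dose-response curves; the interpolation above only conveys the idea behind the measure.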

In vitro
To assess the efficacy of antivirals (CDV, BCV, ST-246) on orthopox viral activity using 50% effective concentration (EC50) as an outcome measure.

In vivo
To assess the efficacy of antivirals in preventing mortality in animal studies compared to placebo.

Clinical safety trials
To assess the safety of antivirals in Phase I, II and III trials compared to placebo.

Human case reports
To summarise the use of antivirals in human cases of orthopoxvirus infection and their effect on disease progression.

Animal models for the study of orthopoxviruses
As VARV is an exclusively human pathogen, no single animal model can reproduce all the disease characteristics of VARV. Consequently, many models have been developed to mimic particular disease characteristics, with varying inoculation routes, OPXV species and drug administration routes (16,17). Though large animal studies in non-human primates (NHP) are advantageous due to their similarities with humans, they are limited by small sample sizes and cost. Therefore, many small animal models have been developed, with a focus on respiratory models (intranasal, aerosol and intratracheal inoculation) as the likely route of infection in a bioterrorism event (18).
VARV models using NHP are commonly used but only produce mild generalised infection and rash (18). Aerosol inoculation requires very high viral doses (measured as plaque-forming units (PFU)), and intravenous models manifest differently depending on dose: 10^9 PFU (high) causes haemorrhagic VARV-like disease (almost 100% fatal and rare in humans), while 10^8 PFU (low) results in a 'lesional' model, though mortality is inconsistent (18,19). As such, other animal models are often used; only 2 studies in this review used a VARV model (16,19).
In cowpox (CPXV) models, aerosol and intranasal routes induce systemic smallpox-like disease. The aerosol route produces more severe pulmonary disease, targeting the lower respiratory tract, while the intranasal route targets the upper respiratory tract due to the larger particle size (18). In this review, 12 studies used intranasal and 2 used aerosol CPXV models, with 1 study using both.
Vaccinia virus (VV) models have used mice or rabbits. Intranasal or aerosol inoculation in mice requires a very high dose to achieve lethality (10^4-10^5 PFU) (18). The intranasal route produces haemorrhagic VARV-like lesions, and lethal infection in BALB/c mice requires a higher PFU of the Western Reserve (WR) vaccinia strain than in C57BL/6 mice (18,20). SKH-1 hairless mice are used for dermal infections (18). Intradermal rabbit models resemble the disease more closely than aerosol models; however, antiviral efficacy has only been tested in mice via the intranasal and intravenous routes, yielding 13 and 1 studies respectively. For rabbitpox virus (RPXV) models, aerosol and intradermal inoculation produce disease progression similar to human smallpox (18). This is a good model for airborne VARV transmission, as infected rabbits can transmit airborne infection. Two and four studies were found on aerosol and intradermal inoculation, respectively, as well as 1 study on both.

The ectromelia virus (ECTV) model shares many disease features with smallpox and is conducted in mice, for which the knowledge of genetics and immunology is extensive (18). Its disadvantages are that mice are naturally infected via the skin and that the major cause of death is liver disease. A commonly studied model is lethal intranasal ECTV in A/Ncr mice, which causes 100% mortality within 7-10 days. As humans are less susceptible to OPXV, a low-dose intranasal infection of C57BL/6 mice (resulting in 60-80% mortality) may be a more suitable model (21). Further, ECTV modified with the interleukin-4 (IL-4) gene is particularly useful, as the virus is lethal to naturally resistant and vaccinated mice (22). This review found 7 studies using intranasal inoculation, 2 using aerosol and 1 using both.
Monkeypox virus (MPXV) is a zoonotic OPXV that is a public health concern in its own right; it is endemic in regions of Africa and caused an epidemic in the US following the importation of infected African animals (23). NHP MPXV models are well established, and 5 relevant studies were identified. Small-animal models are also useful; 5 papers describe African dormice, ground squirrels, prairie dogs, marmots and STAT1-deficient C57BL/6 mice as highly susceptible and capable of producing human-like disease (23)(24)(25)(26).
Camelpox virus (CMLV) is genetically the most similar OPXV to VARV (27). However, it has only recently been used in animal models, as immunocompetent mice are naturally resistant (28). Immunodeficient athymic nude mice were found to be susceptible, establishing the first small animal model for this virus; only 1 study was found (28).

Results
A total of 1010 studies were identified through search strategies A and B on the efficacy of CDV, BCV and tecovirimat, including in vitro and in vivo animal studies, human clinical trials, and human cases of OPXV infection. After removal of duplicates, non-English-language studies and studies that did not test the chosen antivirals, 806 abstracts were reviewed. Of these, 230 full-text articles were reviewed and 158 met the inclusion criteria (Figure 1).
The included studies were separated into 4 groups corresponding to each arm of the review: there were 51 in vitro studies, 56 in vivo, 15 containing both, 10 human clinical trials, and 26 human case reports.
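The group counts reported above are internally consistent with the 158 included articles; a quick check:

```python
# Counts of included studies by review arm, as reported in the text.
groups = {
    "in vitro": 51,
    "in vivo": 56,
    "both in vitro and in vivo": 15,
    "human clinical trials": 10,
    "human case reports": 26,
}

# The groups partition the included articles, so they should sum to 158.
total_included = sum(groups.values())
```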

In vitro findings
A total of 66 studies tested the efficacy of CDV, BCV and/or tecovirimat in vitro. Of these, 22 assessed the antiviral drugs as the main objective (Table 3); the remaining 44 used the antivirals as reference drugs when testing models or novel drugs (Appendix B). CDV was by far the most studied, appearing in 57 studies; by comparison, BCV was studied in 9 papers and tecovirimat in 12.

In vivo findings in healthy animal studies
In this review, 71 studies tested the efficacy of CDV, BCV and/or tecovirimat in vivo in animals against lethal OPXV challenges. CDV appeared in 42 studies and was the most studied antiviral; there were 19 BCV and 20 tecovirimat studies. The most commonly used models were VV and ECTV. Results were grouped by route of virus inoculation: respiratory (intranasal, aerosol or intratracheal) and systemic (intradermal, subcutaneous, intravenous).

Cidofovir
CDV can be delivered intranasally, intraperitoneally, subcutaneously or via aerosol, and has been tested in various animal models against lethal doses of VV, CPXV, ECTV, rabbitpox virus (RPXV) and MPXV. Of the 42 studies on CDV, most were conducted in CPXV and VV models (Table 4).
Intraperitoneal CDV given prophylactically is protective up to 5 days prior to infection in both single- and multi-dose regimens (20,64,68). Delivered via aerosol (14C-cidofovir), CDV may be less nephrotoxic due to greater retention of the drug in the lungs versus the kidneys. An aerosol dose of 0.5-5mg/kg was highly efficacious (80-100%) up to 2 days prior to challenge, offering protection comparable to subcutaneous delivery (69).

Intranasal ectromelia model (3 studies)
A single dose of CDV in both BALB/c and A/NCr mice was protective up to 6 days and 3 days respectively (44,79,80).

Monkeypox model (2 studies)
Intranasally inoculated African dormice were significantly protected by a single dose of CDV (25). Another study found that in an intratracheal inoculation model, a 'humanised' dose of 5mg/kg CDV was more protective than traditional vaccination (81).

Brincidofovir
BCV is delivered via oral gavage and has been tested in various animal models against lethal doses of CPXV, VV, RPXV, ECTV and MPXV. A total of 19 studies assessed BCV efficacy, the majority in ECTV or RPXV models (Table 5).

Brincidofovir efficacy in lethal orthopoxvirus respiratory challenges
Intranasal cowpox model (1 study)
BCV given as a single- or multi-dose regimen offers therapeutic protection up to 3 days p.i. (68). BCV was also protective when given prophylactically 1-5 days prior to infection.

Intranasal vaccinia model (3 studies)
In a VV-IHD challenge, single doses of BCV (25-100mg/kg) were protective against mortality (76). In comparison, lower doses (2.5-10mg/kg) were only weakly efficacious even if given for a duration of 5 days. BCV could be delayed to 2 days p.i. and protect against both WR and IHD strains (68, 84).
Aerosol rabbitpox model (1 study)
BCV protected 2 of 3 animals when given as one, two or three 20mg/kg doses on observation of secondary lesions (85). Though the sample size was small, this suggests BCV may offer some post-lesional protection.

Intranasal monkeypox model (1 study)
Only 1 study assessed this model, using STAT1-deficient C57BL/6 mice, which are particularly sensitive to MPXV (26). It found that 10mg/kg initiated immediately p.i. for a duration of 14 days provided 100% protection. However, when mice were re-challenged on day 38 p.i., 20% succumbed to infection.

Tecovirimat
Tecovirimat is delivered via oral gavage and has been tested in various animal models against lethal doses of CPXV, VV, RPXV, ECTV and MPXV. A total of 20 studies involved tecovirimat, the majority using MPXV and VV models (Table 6).

Intranasal vaccinia model (3 studies)
Against WR and IHD strains, a 100mg/kg dose was fully protective when given immediately after infection for 14 days (44,94). Differences were noted in the minimum dosing duration between WR and IHD strains, which were 5 and 2 days respectively (46,94,95). Tecovirimat was still efficacious even when delayed 3 days p.i. (94).

Intranasal ectromelia model (3 studies)
Tecovirimat is highly efficacious, providing full protection even when delayed up to 5 days p.i. (44,46,88). Significant protection from mortality (73%) was still seen at 6 days p.i.; however, lesions may not be a reliable marker for treatment initiation as they only appear from day 7 (88).

Tecovirimat efficacy in lethal orthopoxvirus systemic challenges

Intravenous variola model (2 studies)
In NHP, tecovirimat given at 300mg/kg was fully protective when initiated immediately or 1 day p.i. (16). At a dose of 10mg/kg, tecovirimat could be delayed up to 4 days p.i. (19).

Intravenous monkeypox model (2 studies)
Doses between 3 and 300mg/kg were highly protective up to 5 days p.i. if given for a duration of 14 days; though 3mg/kg was the minimum protective dose, 10mg/kg also reduced viremia and lesion count (16, 98-100). As lesions appear by 1 day p.i., these results suggest tecovirimat can be given post-lesionally (99). On this basis, the recommended human therapeutic dose is 400mg, which would provide exposure levels comparable to 10mg/kg in NHP.
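The jump from animal mg/kg figures to a fixed human dose can be illustrated with the FDA's body-surface-area conversion. This is a simplification: actual regulatory dose selection for tecovirimat used pharmacokinetic exposure (AUC) matching under the Animal Rule. The Km factors and the 70kg body weight below are standard illustrative values, not figures from the studies reviewed.

```python
# Illustrative human-equivalent-dose (HED) conversion using standard FDA
# body-surface-area Km factors (mouse 3, cynomolgus monkey 12, adult human 37).
# Note: real tecovirimat dose selection matched drug exposure (AUC), not HED.
KM = {"mouse": 3.0, "monkey": 12.0, "human": 37.0}

def hed_mg_per_kg(animal_dose_mg_per_kg, species):
    """Convert an animal mg/kg dose to a human-equivalent mg/kg dose."""
    return animal_dose_mg_per_kg * KM[species] / KM["human"]

def total_dose_mg(animal_dose_mg_per_kg, species, body_weight_kg=70.0):
    """Total human dose for an assumed adult body weight."""
    return hed_mg_per_kg(animal_dose_mg_per_kg, species) * body_weight_kg
```

For example, 10mg/kg in NHP scales to roughly 3.2mg/kg in humans (about 227mg for a 70kg adult), illustrating why exposure-matched human doses sit far below the raw animal mg/kg figure.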

Subcutaneous monkeypox model (1 study)
In a ground squirrel model, tecovirimat treatment of 100mg/kg was fully protective up to 4 days p.i. (24). When 100mg/kg was given by oral gavage 0 or 1 day p.i. for 5-14 days, dosing duration beyond 5 days did not appear to be an important factor, and treatment initiated 1 day p.i. yielded better results than treatment initiated on the day of challenge.

Resistance studies
Despite the potential of these antivirals, a major risk is the development of resistant OPXV (101). However, we found only 3 studies assessing antiviral efficacy against CDV-resistant strains.

Synergistic efficacy of BCV and tecovirimat
Combination therapy is an important consideration as it could reduce the risk of developing treatment-resistant OPXV strains (22). Coadministration of BCV and tecovirimat was discussed in only 2 studies (Table 7).
In different dose combinations, BCV and tecovirimat coadministration consistently provided high levels of protection where monotherapy did not, and no evidence of toxicity was observed (39,103). Significantly, lower doses of each antiviral are required in a coadministration regimen, which minimises the risk of AEs without reducing the therapeutic effect. Coadministration therapy could also be delayed up to 6 days p.i. (39).
One study assessed BCV and tecovirimat coadministration against a model of vaccine resistance, using an ECTV recombinant strain encoding the murine IL-4 gene, which is lethal to immunised mice (103). Combination therapy protected 75% of mice, whereas antiviral monotherapy led to complete fatality.
The remaining experiments, detailed in Table 7, used intranasal CPXV (BR) challenge in BALB/c mice treated by oral gavage with tecovirimat (1, 3 or 10mg/kg) and BCV (0.3, 1 or 3mg/kg), individually or together, for 5 days. In one dose-ranging experiment, tecovirimat alone at 3mg/kg (80%) and BCV alone at 1 and 3mg/kg (80% and 100% respectively) were highly protective; in combination, treatment was highly efficacious in all groups except the regimen with the lowest doses of both antivirals, and no adverse reactions were observed. When treatment began 3 days p.i., tecovirimat alone at 3 and 10mg/kg (67% and 87% respectively) and BCV alone at 3mg/kg (73%) were highly protective, and all doses of either drug extended mean time to death; in combination, treatment was highly efficacious in 7 of 9 groups, including animals receiving 1mg/kg of both compounds, doses that offered no protection alone, indicating that combination therapy can improve efficacy. When treatment was delayed to 6 days p.i., neither tecovirimat nor BCV alone significantly reduced mortality, while combinations demonstrated efficacy in 3 of 9 groups; as no mice survived these doses given alone, combination therapy appears to provide synergistic efficacy against lethal CPXV challenge.

Chen et al., 2011 (104) administered 100mg/kg tecovirimat and 4mg/kg BCV individually or together, beginning immediately p.i. and continuing for 14 days. Alone, neither tecovirimat nor BCV protected any mice from mortality; in combination, treatment was 75% protective against mortality.

In vivo findings in immunodeficient animal studies
In this review, 11 studies assessed antiviral efficacy in immunodeficient mice: 8 on CDV, 1 on BCV and 2 on tecovirimat (Table 8).

Cutaneous vaccinia model (5 studies)
Immunodeficiency can be modelled using SKH-1 hairless mice immunosuppressed with cyclophosphamide 1 day prior to infection. Topical CDV (1% cream) twice daily for 7 days protected 10-40% of these mice but was not efficacious when given at longer 92h intervals (105-107). CDV delivered intraperitoneally did not protect from mortality (106, 107). Though CDV (delivered topically or intraperitoneally) did not offer high efficacy, it consistently delayed time to death and reduced primary lesion size and satellite lesion number. A triple therapy combination of 0.5% topical CDV, 50mg/kg intraperitoneal CDV and VIG was most efficacious in delaying time to death compared to mono- or double-therapy combinations (107).
Another model used athymic nude mice, which lack a thymus and are therefore T cell deficient. Topical CDV (1% cream) was 75-100% protective up to 2 days p.i., which is prior to viral spread to organs (108). When treatment was initiated after onset of disseminated infection (approx. day 15), a subcutaneous dose of 100mg/kg CDV for minimum 3 weeks protected 80-100% of mice.
Severe combined immunodeficiency (SCID) mice lack both B and T cells. In this model, all mice succumbed to VV; however, mean time to death was significantly extended (20, 109).

Intranasal cowpox model (1 study)
A CDV dose of 100mg/kg that was protective in immunocompetent mice could not protect SCID mice, even with repeated therapy (64).

Intraperitoneal cowpox model (1 study)
CDV doses between 2.2 and 20mg/kg could not protect SCID mice (20). However, treatment delayed time to death and reduced viral replication in organs.

Intranasal camelpox model (1 study)
A 100mg/kg CDV dose delivered immediately p.i. for 3 days could provide full protection in athymic nude mice (28).

Brincidofovir
Only 1 study assessed BCV efficacy in an immunodeficient animal model. To model severe immunodeficiency, BALB/c mice lacking T cells were challenged with VV-IHD. Although all mice succumbed to disease, time to death was significantly delayed (84).
Less severe immunodeficiencies were modelled by partially reconstituting mice with T cells from healthy mice 1 day prior to infection (84). BCV was significantly protective (57-100%) and facilitated the development of strong adaptive immune responses that protected mice from re-challenge without further treatment.

Tecovirimat
In this review, 2 studies assessed the efficacy of tecovirimat in an intranasal lethal vaccinia model. In Nude and SCID mice, tecovirimat was not protective but could delay disease progression (110). In BALB/c mice lacking CD4+ or CD8+ T cells, 100mg/kg of tecovirimat was 100% protective when administered up to 3 days p.i. (110). In the same experiment, Jh mice (which genetically lack B cells), with or without additional depletion of CD4+ T cells, CD8+ T cells, or both, were modelled. 100mg/kg of tecovirimat was 100% protective when administered immediately p.i. in Jh, Jh/CD4- and Jh/CD8- mice. When administered 3 days p.i., Jh/CD8- and Jh/CD4- mice were protected 100% and 80% respectively, while all Jh mice succumbed to disease. Jh mice lacking both CD4+ and CD8+ T cells did not survive under any treatment.
In Nude BALB/c mice, tecovirimat is not protective unless mice are reconstituted with T cells, in which case full protection was conferred (95).

In vivo synergistic treatment with antiviral and vaccination
The antivirals CDV, BCV and tecovirimat do not inhibit the development of protective immunity when co-administered with vaccination (Table 9).

Cidofovir
The synergistic effect of CDV and vaccination was tested by 2 studies. In an intranasal ectromelia model with immunocompetent BALB/c mice, CDV and vaccination (Lister and ACAM3000) demonstrated synergistic efficacy (79). The regimen was efficacious even when given pre-exposure to ECTV, and up to 4 days post-exposure. In contrast, in an NHP monkeypox model, 1 study found that coadministration of a single dose of CDV with Dryvax significantly reduced vaccine-related immune responses (112). Though the coadministration regimen was still efficacious compared to naïve controls, it resulted in a higher lesion count and reduced survival compared to Dryvax alone; CDV's ability to inhibit viral replication appears to compromise Dryvax-induced immunity.

Brincidofovir
Only 1 study assessed BCV and vaccination. Using an intranasal ectromelia model with immunocompetent A and C57BL/6 mice, it found that BCV could be co-administered with Dryvax, ACAM2000 and ACAM3000, reducing the severity of vaccination-related lesions without preventing the development of protective immunity (87). Comparison of ACAM2000 and ACAM3000, which are replicating and non-replicating vaccines respectively, indicated that BCV's mechanism of action is likely limitation of viral replication rather than inhibition of the immune system.

Tecovirimat
The efficacy of tecovirimat with vaccination was assessed by 3 studies. Coadministration with ACAM2000 was tested in healthy cynomolgus macaques in an intravenous monkeypox model (100). Where ACAM2000 given alone was not efficacious, all animals treated with tecovirimat, with or without ACAM2000, were fully protected from the initial challenge and from re-challenge 2 months later. Further, tecovirimat did not inhibit the development of short- and long-term protective immunity in a lethal intranasal vaccinia model in BALB/c mice (113). Tecovirimat reduced the severity of lesion formation after vaccination with VV (WR) but did not affect the formation of less severe lesions from Dryvax vaccination. This indicates that tecovirimat coadministered with vaccination will not inhibit the "take" lesion, which is used as evidence of vaccine protection.
To assess the prospect of tecovirimat coadministration with ACAM2000 for immunodeficient individuals, BALB/c and B cell-deficient (JH-KO) mice with varying degrees of T cell deficiency were challenged in a lethal intranasal vaccinia model (114). Mice treated with the coadministration regimen achieved similar survival rates to mice given vaccination alone, indicating that tecovirimat does not impair the development of short- or long-term protective immunity. Tecovirimat reduced the severity of vaccination lesions in all mice except those lacking both CD4+ and CD8+ T cells.

Human trials
In this review, 9 studies reported on human trials of BCV and tecovirimat safety (Table 10). One review, by Lanier et al., was also included as it contained information on trials not reported in an individual paper.

Brincidofovir
BCV has been studied in Phase I, II and III human trials for the prophylaxis and treatment of various dsDNA viral infections, including smallpox, prophylaxis/pre-emption of cytomegalovirus disease in haematopoietic stem cell transplant (HSCT) recipients, and pre-emptive treatment of adenovirus disease in paediatric HSCT recipients (115). These results are useful in supporting BCV as a treatment candidate for smallpox; the recommended dose is 200mg (22,116).
Two Phase I trials indicate BCV is well tolerated in both adults and children (116). The most common AEs were gastrointestinal, usually diarrhoea. The laboratory AE of elevated serum transaminases was the main reason for treatment discontinuation, though the elevations were later found to be asymptomatic and transient.
BCV was also tested in immunocompromised and haematopoietic cell transplant recipients in Phase II and III trials to prevent or treat cytomegalovirus and adenovirus infections (115-120). Though an increased frequency of AEs was seen, this must be considered relative to the comorbidities in this population. The 200mg once weekly (QW) dose had fewer AEs relative to the 200mg twice weekly (BIW) dose (116, 119). Acute graft-versus-host disease (aGVHD) was an AE specific to this population and led to death (2.3% vs. 1.9% in placebo) (116, 117). Overall, BCV is safe and well tolerated in the general population, including children and immunosuppressed groups. No dose-limiting toxicity has been observed, and humans have been tested with doses higher than that suggested for treating smallpox (115).

Tecovirimat
Tecovirimat has recently been approved as the first drug for smallpox treatment (15). Three Phase I trials demonstrated that tecovirimat is generally safe and well tolerated (121-123). No serious adverse events (SAEs) were observed, and the most common drug-related AE was headache. Across all subjects, only 1 withdrew because of a drug-related AE (headache), and they were in a high-dose group of 800mg/day (123). Absorption was faster in non-fasting volunteers, and form I was chosen for use in further development (121, 122).
One Phase II trial was completed in a generally healthy population with mild comorbidities (124). There were no SAEs or deaths, but 44.9% of subjects reported at least 1 AE; these were mild, most commonly headache or nausea. Withdrawals from the study were not drug-related.
One expanded-safety Phase III trial was conducted in healthy volunteers (98). The dose tested (600mg twice daily for 14 days) provided greater exposure than that considered efficacious. 19.8% of subjects experienced drug-related AEs, commonly headache, osteoarthritis and hidradenitis. One death occurred but was not deemed drug-related: a pulmonary embolism was reported, 1 week after treatment completion, in a patient with a significant medical history.

Human case studies
This review found 26 human cases of OPXV infection treated with antivirals since 1980 (Table 11). Humans can become infected through several routes: contact with infected animal vectors (cats or rats), tampered vaccinia-rabies baits, or military vaccination against smallpox causing an adverse reaction or transmission to an immediate contact (125). In healthy humans, OPXV infections are usually mild; only on rare, serious occasions are antivirals used. Systemic OPXV infections are treated with CDV, BCV, tecovirimat or VIG, and ocular infection is treated with CDV or trifluridine drops.
CDV was administered in 3 cases; a dose of 5mg/kg in a baby with eczema vaccinatum contributed to improvement (126, 127). CDV appears to lack effectiveness in ocular CPXV infection and when the patient is severely immunosuppressed (128, 129). BCV and tecovirimat were used in 2 and 3 cases respectively. In a case of severe immunosuppression, BCV was not able to provide protection (129). Tecovirimat contributed to the successful resolution of symptoms in all 3 cases (126, 127, 130, 131). BCV and oral and topical tecovirimat were used together in a case of progressive vaccinia in a military vaccinee with unknown underlying acute myelogenous leukaemia (AML) (130). After several months of antiviral treatment and chemotherapy, the man recovered. However, tecovirimat-resistant VV was detected late in disease, indicating that BCV may have played an important role in recovery.
VIG has been used in 22 human cases and appears to demonstrate some protective effect (126, 127, 129-144). However, supply is limited as it must be synthesised from blood drawn from smallpox vaccinees. The above case used 241 vials of intravenous VIG (VIGIV), which placed unanticipated strain on the US national stockpile (130). Topical trifluridine was used in 9 cases, all of which resolved successfully (126, 127, 134, 136, 137, 143, 145).

Discussion
Though the re-emergence of smallpox is hypothetical, there is an imperative for continued research, as the consequences would be disastrous. Since its eradication, many compounds have been considered as anti-smallpox agents with varying, but limited, levels of efficacy. They include methisazone, M&B7714, cytosine arabinoside (ara-C), adenine arabinoside, ribavirin, the ceragenin CSA-13, imiquimod, idoxuridine, interferon and phosphonoacetic acid (27,50,152,153).
CDV, BCV and tecovirimat are considered the most viable antivirals in the event of smallpox re-emergence or vaccine AEs. This systematic review examined 230 articles on their efficacy in vitro, in animal studies in vivo, in healthy humans and in human case reports to provide a holistic understanding of their potential use.
In vitro, CDV demonstrated consistently high potency; however, it is limited by poor bioavailability and nephrotoxicity when administered intravenously. BCV, a bioavailable derivative of CDV, is both safer and more efficacious in vitro. Likewise, tecovirimat is also more efficacious than CDV and demonstrates specific activity against multiple VARV and MPXV clades, the two OPXV of greatest concern to human health (44).
In vivo studies in various animal models support the therapeutic use of these antivirals. Both BCV and tecovirimat were efficacious when given in single- and multi-dose regimens and, in most animal models, remained efficacious when delayed several days p.i. One model suggested BCV treatment could be initiated after observation of secondary lesions, though the sample size was small and further study is required to substantiate this (85). In immunodeficiency studies, BCV provided partial protection to mice with moderate immunodeficiency (57-100% survived), but could only extend time to death in mice with severe immunodeficiency (84). Tecovirimat was protective in moderately immunocompromised mice but could only delay death when mice lacked both B and T cell immunity (95, 110).
CDV and BCV demonstrate strong potential for prophylactic therapy; both were shown to be efficacious when given up to 5 days prior to lethal challenge, depending on the animal model (20,64,68). No studies assessed the prophylactic effect of tecovirimat despite its recent FDA approval, a significant gap that should be addressed.
BCV and tecovirimat have been shown to be safe and well tolerated in both adult and paediatric populations in Phase I, II and III trials. The most common BCV-related AEs were gastrointestinal (diarrhoea or nausea); tecovirimat-related AEs were neurological (headache) and gastrointestinal (diarrhoea or nausea) (98, 115-117, 119, 120, 122-124). These AEs were dose-dependent and mild at recommended therapeutic doses. Tecovirimat has been approved under the FDA Animal Rule and is available as 200mg capsules, of which 2 million courses have been delivered to the Strategic National Stockpile. It is now undergoing Phase I development of an IV formulation (15).
Though these antivirals demonstrate promise, a major limitation is the potential for antiviral-resistant strains of OPXV, particularly in a bioterrorism context. Resistant strains can already be generated through selective cell culture in the laboratory and, more concerningly, tecovirimat-resistant VV was detected in a human case of progressive vaccinia after tecovirimat treatment (130). Viral DNA can also be manipulated via synthetic biology. BCV demonstrates a high barrier to resistance, but only a few mutations at the F13L gene are required for VV to become tecovirimat-resistant. Research into antiviral efficacy against resistant OPXV strains is very limited. Only CDV-resistant OPXV has been investigated; CDV was weakly protective, while BCV was partially protective (55, 59, 102). However, the studies are not conclusive, and given the likelihood of tecovirimat resistance, more research needs to be done in this area.
A proposed way to reduce the risk of antiviral resistance is combination therapy. BCV and tecovirimat have strong synergistic efficacy due to their differing mechanisms of action and provide protection against lethal challenge where either antiviral alone could not (39,103). This protection also extended to an ECTV model of vaccination resistance, suggesting combination therapy may be effective against more virulent strains of OPXV. However, data are limited to the two studies in this review, so no definitive conclusion can be made; further research into these promising results is needed.

Conclusion
The achievements of antiviral research for OPXV treatment have greatly changed the landscape of bioterrorism preparedness post-smallpox eradication. Use of antivirals could alleviate the risks of vaccination and extend protection to immunocompromised populations in the event of a smallpox outbreak. Future research should look beyond antiviral monotherapy, as the limited research on combination therapy is promising. Given that antivirals would provide the most benefit for immunodeficient populations, more focus should be given to developing relevant models. Finally, given the risk of antiviral resistance, more robust models to test antiviral efficacy against more virulent strains should be developed.

Introduction
Smallpox was eradicated in 1980 but remains a category A bioterrorism agent [1]. The only official stocks of the virus are in the United States and Russia [2], but unofficial stocks may be present elsewhere. The variola genome is fully sequenced, and advances in synthetic biology have increased the likelihood of smallpox being synthesized in a laboratory [3]. Experts had previously dismissed the threat of synthetic smallpox as unlikely but were proven wrong when, in 2017, Canadian researchers synthesized an extinct poxvirus and published the methods in an open access journal [4]. Smallpox or a variant thereof may re-emerge from bioterrorism or a laboratory accident [5] and is thus a high priority for preparedness planning [6]. Due to ageing, advances in medical therapies, transplantation and people living with immunosuppressive conditions such as HIV, the immunological status of the population has also changed dramatically in the decades since the eradication of smallpox, with almost one in five people living with immunosuppression in developed country settings [7].
A large proportion of the population today is unvaccinated, and residual immunity in cohorts vaccinated prior to 1980 is waning [8-10]. The World Health Organization has a stockpile of 33.7 million doses, mostly second-generation ACAM2000 but also some first-generation vaccine. The majority of the stockpile is pledged from member countries, with only 2.7 million doses physically held by WHO [11].
In low-income countries, weak health systems and shortages of human resources for health predict a more severe impact of serious epidemics [12]. In a risk analysis model for Ebola, we showed that country factors, such as gross domestic product and the ratio of physicians to population, combined with disease-specific factors can predict catastrophic epidemic impacts and the need for urgent intervention [12]. The Pacific is a uniquely vulnerable region because of widely dispersed islands, geographical and population diversity between nations, natural disasters and extensive informal maritime transport routes which can transmit infectious diseases [13]. The region also suffers a crisis in human resources for health because of the migration of skilled health workers and limited production of qualified health workers. For all these reasons, a smallpox attack in the Pacific may be difficult to control. The wider Asia-Pacific region contains highly populous low-income countries with mega-cities. Epidemics arising in Asian mega-cities may similarly be difficult to control, with greater potential for spread due to high population densities [14].
On August 16th, 2018, we held Exercise Mataika to test preparedness for a worst-case scenario of a smallpox attack which begins in the Pacific and is followed by a larger-scale attack in a highly populous Asian country [15]. The purpose of exercising a worst-case/severe scenario was to identify key vulnerabilities that can be mitigated or prevented. The exercise was underpinned by mathematical modelling which aimed to determine the duration and magnitude of the epidemic under different scenarios, the critical threshold for epidemic control, and scenarios in which the current stockpile of vaccine is inadequate. This paper describes the modelling underpinning the exercise.

Aims
We aimed to determine the influence of disease control measures (case isolation, contact tracing and vaccination) on epidemic control. We also aimed to determine the adequacy of the current global smallpox vaccine stockpile and the duration and magnitude of the epidemic under different scenarios. The objective was to inform preparedness and prioritise planning to avoid a worst-case scenario.

Methods
We constructed a modified SEIR model for smallpox transmission based on our published model [7]. We assumed that the virus was not genetically modified and that there is minimal residual vaccine-induced immunity in the world [7]. We assumed an attack in an airport in a crowded city in a de-identified, highly populous Asian country, starting the epidemic with 10,000 infected. Case fatality rates were based on the expected distribution of haemorrhagic, flat, ordinary and modified smallpox [7].
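The skeleton of such a model can be sketched with Euler integration, showing how case isolation enters as a reduction of the effective transmission rate. This is a minimal illustration with placeholder parameter values (the R0, incubation and infectious periods below are assumptions for demonstration, not the paper's fitted parameters), without the age structure, residual-immunity multipliers or ring vaccination of the full model:

```python
def run_seir(days, population, initial_infected, r0=5.0,
             incubation=12.0, infectious=8.8, isolated=0.0, dt=0.1):
    """Euler integration of a basic SEIR model.

    `isolated` is the proportion of infectious cases removed from
    transmission by case isolation (assumed 100% effective in preventing
    transmission, as in the model). Returns cumulative infections.
    """
    beta = r0 / (infectious * population)   # per-pair transmission rate
    s = population - initial_infected
    e, i, r = 0.0, float(initial_infected), 0.0
    cumulative = float(initial_infected)
    for _ in range(int(days / dt)):
        new_e = beta * (1.0 - isolated) * i * s * dt  # S -> E
        new_i = e / incubation * dt                   # E -> I
        new_r = i / infectious * dt                   # I -> R
        s -= new_e
        e += new_e - new_i
        i += new_i - new_r
        r += new_r
        cumulative += new_i
    return cumulative
```

Running this with and without isolation illustrates the qualitative behaviour the paper explores: high isolation coverage pushes the effective reproduction number below 1 and the epidemic dies out rather than sweeping through the population.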

Interventions
We considered case isolation, contact tracing and ring vaccination (the combined intervention of contact tracing and vaccination of contacts of cases) as the key interventions for pandemic control. Only the United States has a stockpile of antivirals sufficient for the entire population, so in most cases, a ring vaccination strategy would be more feasible to ensure a rapid response. Antivirals, where available, would be given after diagnosis and isolation, so we assumed they would not add to epidemic control above the effect of isolation alone. However, antivirals would likely reduce morbidity and mortality for treated cases.
Proper case isolation in suitable facilities was assumed to stop transmission of smallpox, as observed during the period of endemic smallpox [2].
Contact tracing and vaccination were assumed to be a combined intervention, with close contacts of cases traced, vaccinated and monitored for development of disease. We assumed 95% vaccine efficacy [2,16-18]. We tested the impact of varying the following parameters on the epidemic: 1. the size of the initial attack (50, 100, 1000 or 10,000 infected); 2. the time of commencing the response; 3. the proportion of cases isolated; and 4. the proportion of contacts traced and vaccinated.

Mathematical Model
The SEIR model uses ordinary differential equations to transfer people between epidemiological states related to their smallpox infection status. For the age-specific force of infection, we used Euler's approximation to make discrete contact rates using data from the contact matrix for the specific affected Asian country (which remains anonymous in this paper) for 100 days. We simulated an outbreak starting in this highly populated country using the projected age-specific contact matrix for that country [19]. However, in order to simulate the infection spreading globally, after 100 days we changed the contact matrix to better represent contact patterns in the entire world population. To do this, we estimated the world contact matrix [19] as an average of the rates from the 10 most populous countries [20].
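The averaging step described above amounts to an element-wise (optionally population-weighted) mean of per-country matrices. A minimal sketch follows; the 2x2 matrices in the test are hypothetical stand-ins for the real age-structured matrices from [19]:

```python
def average_contact_matrix(matrices, weights=None):
    """Element-wise weighted mean of equally sized contact matrices.

    `matrices` is a list of n x n nested lists of age-group contact
    rates; `weights` (e.g. country populations) defaults to equal
    weighting across countries.
    """
    if weights is None:
        weights = [1.0] * len(matrices)
    total = sum(weights)
    n = len(matrices[0])
    return [[sum(w * m[i][j] for m, w in zip(matrices, weights)) / total
             for j in range(n)]
            for i in range(n)]
```

Supplying country populations as `weights` would tilt the "world" matrix toward the contact patterns of the most populous countries, which is one plausible reading of averaging over the 10 most populous countries.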
Residual smallpox vaccine immunity in the population was based on our previous estimates [7,21,22]. We multiplied the force of infection by a parameter (α1, α2, α3, α4) to account for different population susceptibility levels. Disease parameters and their estimation, as well as different grades of infectivity, were estimated as previously described [7]. Table 1 shows the model parameters.

Sensitivity analysis
We explored attack scenarios of 50, 100, 1000 and 10,000 initial infected to determine the impact on the epidemic. The base case was a worst-case scenario of 10,000 infected. Delays in diagnosis and time to obtaining laboratory confirmation could vary the time of onset of the response. We therefore varied the time of the response commencing between 20, 30 and 40 days following virus release. Given an average incubation period of 12 days for smallpox [2], this corresponds to days 8, 18 and 28 after the onset of symptoms of the index case.

Table 1. Model parameters:
- Average number of contacts per case: 11 [21]
- Proportion of contacts traced around an infected case: 90%; sensitivity analysis with 70%, 50% and 30% [2]
- Proportion of cases isolated once infected and symptomatic: 90%; sensitivity analysis with 70%, 50% and 30% [2]
- Time of starting intervention: day 20, 30 or 40 after release, corresponding to 8, 18 and 28 days after the onset of symptoms of the index case, using an average incubation period of 12 days [2]
- Population of the world: 7,383,008,820 [22]
- Initial infected: 10,000; sensitivity analysis with 50, 100 and 1000
- Vaccine efficacy: 95% [18]
- Efficacy of case isolation in preventing transmission: 100% [2]

Given that a smallpox epidemic in a low-income country would be complicated by weak health systems and low human resources for health, a sensitivity analysis was conducted on the proportion of infectious cases isolated and contacts traced and vaccinated ("ring vaccination").
To test which of these interventions is more influential, we fixed one at 90% and varied the other between 30%, 50%, 70% and 90%. Given the large number of possible combinations of these proportions between the two interventions of case isolation and ring vaccination, we tested 30% (30/30), 50% (50/50), 70% (70/70) and 90% (90/90) for each intervention to determine the impact of differing completeness and capacity of the public health response on epidemic size and duration. Time to the end of the epidemic was modelled; if the modelled epidemic did not end within 11 years (4000 days), we assumed smallpox would become endemic. We also estimated the amount of vaccine required in the differing scenarios above (based on the number of contacts to be traced and vaccinated in a ring vaccination strategy), in order to identify scenarios where the WHO stockpile is adequate or inadequate to meet disease control needs.
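The vaccine requirement estimate reduces to a product of cases, contacts per case and tracing coverage. The sketch below illustrates that bookkeeping; the stockpile constant is the total (pledged plus held) figure quoted in the introduction, and the function names are our own, not the paper's:

```python
WHO_STOCKPILE_DOSES = 33_700_000  # total pledged + physically held doses

def ring_vaccination_doses(cumulative_cases, contacts_per_case=11.0,
                           traced=0.9):
    """Doses needed to vaccinate traced contacts around each case."""
    return cumulative_cases * contacts_per_case * traced

def stockpile_adequate(cumulative_cases, **kwargs):
    """Whether the WHO stockpile covers the ring vaccination demand."""
    return ring_vaccination_doses(cumulative_cases, **kwargs) <= WHO_STOCKPILE_DOSES
```

Under these assumptions the stockpile covers roughly 3.4 million cases at 90% tracing; beyond that, demand outstrips supply, which is the crossover the dose-requirement analysis probes.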

Critical epidemic control threshold
The modelling above indicated that the prospect of early epidemic control was lost at the 50/50 level of contact tracing and case isolation. We therefore ran the model at varying levels between 50% and 60% to identify the threshold value below which epidemic control is lost.

Results
Figure 1 shows the epidemic without interventions for attack sizes varying from 50 to 10,000. Figure 2 shows the influence of the time to commencing the response, varying between 20, 30 and 40 days from the initial attack, with a larger epidemic resulting from each 10 days of delay. Figure 3 shows the time to epidemic control by varying rates of case isolation and ring vaccination. If ring vaccination and isolation rates of 70% or higher each can be achieved, the epidemic will end in less than a year. If rates are 50% each, the epidemic will continue for more than 9 years. At rates less than 50% each, the modelled epidemic does not end within a decade, and smallpox becomes endemic.
Supplementary Table 1 shows the matrix of modelled outputs for combinations of ring vaccination and case isolation rates by response time; greyed-out cells indicate that the epidemic did not end within 10 years in these scenarios. If ring vaccination rates are below 50%, case isolation rates must be at least 90% for any prospect of epidemic control. We tested equal proportions of case isolation and ring vaccination in each scenario, but various permutations are shown in Supplementary Table 1. Supplementary Figures 1 and 2 show the influence of varying rates of ring vaccination on the epidemic, keeping case isolation rates constant at 90%. Figure 4 shows the influence of varying case isolation rates on the epidemic, keeping the ring vaccination rate constant at 90%. Whilst both measures are effective, case isolation is more influential than ring vaccination.
Because Figure 3 shows that epidemic control is lost at the 50% level, we tested values between 50% and 60% to identify the threshold below which control is lost.
This was identified as 53%. Figure 4 shows model outputs at values between 53-56% and shows that the critical epidemic control threshold is >53% each of case isolation and ring vaccination. If the proportions are 53% or less, epidemic control is lost. Figure 5 shows the vaccine dose requirements for epidemic control at varying proportions of case isolation and ring vaccination. The stockpile is adequate up to 54% each of case isolation and ring vaccination, provided the response commences within 30 days of the attack. If it is delayed to 40 days, the stockpile is exceeded, and 57,299,000 doses are required. If, however, the epidemic threshold is crossed and rates fall to 50% case isolation and ring vaccination, over a billion doses of vaccine will be required to achieve epidemic control.
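The existence of a sharp control threshold near equal intervention rates in the low 50s can be illustrated with a deliberately crude reproduction-number calculation. This is not the study's transmission model: the basic reproduction number (R0 = 4.5) and the assumption that isolation and ring vaccination independently scale down onward transmission are illustrative assumptions only.

```python
import math

R0 = 4.5  # assumed basic reproduction number for smallpox (illustrative only)

def effective_r(p_isolation, p_vaccination, r0=R0):
    """Crude sketch: each intervention independently removes a fraction
    of onward transmission; control requires the result to fall below 1."""
    return r0 * (1 - p_isolation) * (1 - p_vaccination)

# Sweep equal isolation/vaccination rates between 50% and 60%,
# mirroring the threshold search described in the text.
for pct in range(50, 61):
    p = pct / 100
    r_eff = effective_r(p, p)
    status = "controlled" if r_eff < 1 else "control lost"
    print(f"{pct}% each: R_eff = {r_eff:.3f} -> {status}")

# With equal rates p, R_eff = 1 at p* = 1 - sqrt(1/R0):
p_critical = 1 - math.sqrt(1 / R0)
print(f"critical rate under these assumptions: {p_critical:.1%}")
```

Under these assumptions the crossover lands at roughly 53%, close to the threshold the full model identified; that agreement reflects the choice of R0 here rather than a validation of either model.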

Discussion
In the case of a large-scale smallpox attack in a low-income, highly populous country, high rates of case isolation and ring vaccination are not guaranteed. Weak health systems and a lack of human resources for health may create the circumstances for a catastrophic epidemic, which would occur if case isolation and ring vaccination rates of at least 53% each are not achieved. In this case the epidemic could persist for a decade or longer, or smallpox may even become endemic, as shown by our model. The factors influencing epidemic size are the size of the initial attack, the time to commencing the response, case isolation rates and ring vaccination rates. Whilst the size of an attack may not be within our control, the other influential factors are potentially modifiable. An outbreak of smallpox can be controlled with a rapid response combining high rates of case isolation and high rates of ring vaccination. The latter depends on exhaustive contact tracing, which requires a high investment in human resources, given that each case has on average at least 10 contacts [21]. However, if the response is delayed to 30 days or longer from the time of the initial attack (which in practice equates to 18 days after the first symptoms occur), or if the attack infects 10,000 people or more, epidemic control will be much more challenging. Rapid response time is critical, especially in the case of a large attack. Delaying the response beyond 20 days from the virus release (which means commencing the public health response within 7 days of symptom onset, given 12 days of incubation) will result in a more severe outbreak. Whether it is feasible to commence the response within the best-case scenario of 7 days after symptom onset is unknown, but unlikely, particularly when a global stockpile of vaccine pledged from donor countries needs to be deployed through a specified WHO process [23,24].
Low-income countries are at risk of more severe epidemics due to a lack of resources for case isolation, contact tracing and treatment. In addition, vaccination and protection of first responder teams will add some delay to deployment of the epidemic response. A rapid response also depends on rapid diagnosis of smallpox, but clinicians are unfamiliar with the disease and may not recognise it. In fact, the last outbreak in Europe, in Yugoslavia, was characterised by failure to diagnose the single index case, resulting in a large outbreak [25].
Recently, the clinical diagnoses of serious emerging infectious diseases have been missed by emergency clinicians, including cases of Ebola in Nigeria and the US, both of which occurred during a peak of media reporting on the West African epidemic, when awareness should have been high [26,27]. A similar failure was seen with MERS coronavirus in South Korea, where the diagnosis was repeatedly missed at multiple hospitals [28].
Better syndromic surveillance, point-of-care tests and triage protocols for high-consequence outbreaks such as smallpox would help prevent a worst-case scenario. However, rapid diagnostics are useful only if the clinical diagnosis is suspected and triggers testing. Pre-vaccinating teams for emergency response would also reduce avoidable delay. In the US, following 9/11, large-scale smallpox vaccination of first responders commenced but was ceased due to adverse events [29]. Given that the likelihood of a smallpox epidemic is unknown, a suitable option in the non-epidemic period would be to vaccinate small teams of first responders with third-generation, non-replicating vaccines, thereby reducing the risk of adverse events and improving the ability to commence a response rapidly.
Other areas to reduce delay could involve preplanned and pre-designated facilities for isolation of cases and surge capacity for contact tracing. Epidemic control is sensitive to both ring vaccination and case isolation rates, which need to be maintained at high levels. Having plans for rapidly deployable physical space and human resources to ensure rapid and thorough case finding, isolation, contact tracing, vaccination and quarantine is key to preparedness planning. Clinical and public health workforce requirements should be estimated and surge capacity planned for.
This may necessitate the use of community volunteers, especially for contact tracing, as contact numbers will be at least an order of magnitude higher than case numbers [21]. During the eradication of smallpox, community volunteers were provided financial incentives to assist with case finding [30]. Plans for incentivising community members should be considered as part of pandemic planning, given the importance of a rapid response. A global response is also required, to ensure resources are directed to areas of the greatest epidemic intensity rather than being retained in high-income countries with low epidemic intensity.
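The scale of that surge requirement can be sketched with simple arithmetic. Only the ~10 contacts per case figure comes from the text [21]; the daily case counts and the per-tracer throughput below are hypothetical placeholders of the kind a planning exercise would have to supply.

```python
import math

def tracers_needed(new_cases_per_day, contacts_per_case=10,
                   contacts_per_tracer_per_day=6):
    """Back-of-envelope contact-tracing workforce estimate.
    contacts_per_case ~10 is taken from the text [21]; a throughput of
    6 contacts per tracer per day is an assumed placeholder value."""
    contacts = new_cases_per_day * contacts_per_case
    # Round up: a partially loaded tracer still has to be staffed.
    return math.ceil(contacts / contacts_per_tracer_per_day)

# Hypothetical daily caseloads:
print(tracers_needed(100))    # 1,000 contacts/day -> 167 tracers
print(tracers_needed(1000))   # 10,000 contacts/day -> 1,667 tracers
```

Even modest caseloads translate into hundreds of dedicated tracers per day, which is why the text points to community volunteers as a likely necessity.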
In Exercise Mataika, we considered a worst-case scenario to identify critical weak points, prioritise the most influential factors and then plan how severe impacts can be avoided.
This research provides a framework for disease control targets in the event of a smallpox epidemic. It illustrates the importance of not allowing control measures to fall below the threshold for epidemic control, which appears to be 53%. In low-income countries, a smallpox epidemic could overwhelm the health system and far exceed human resource capacity, so the low rates of case isolation and contact tracing modelled in this study are a realistic possibility. The consequences of poor epidemic control are catastrophic if only 50% of cases are isolated and 50% of contacts traced and vaccinated: not only does the duration of the epidemic blow out to a decade or longer, but smallpox may become endemic and vaccination requirements will far exceed the available WHO stockpile. In the less severe scenarios, the WHO vaccine stockpile is adequate for epidemic control, but stockpiling provides a limited duration of supply, and the epidemic may run for 3,000 days or more, depending on the scenario. Global vaccine manufacturing capacity is limited, and the time lag for scaled-up production is 12-18 months. Available vaccine could be diluted in the event of a shortage [31]. If vaccine is not available, all efforts must focus on case isolation, contact tracing and surveillance as effective interventions for epidemic control. However, these require planning for surge capacity in physical space and human resources, as well as global coordination of the response in the most severely affected areas. Other influential modifiable factors, such as rapid response, must be factored into planning to ensure a worst-case scenario is avoided. Training and capacity building, as well as pre-vaccinated teams, can also assist with rapid response, as every day of delay worsens epidemic control. Global cooperation is also critical to this planning, so that resources are directed quickly to affected areas.

Supplementary Information
Supplemental information as referenced in the text (PDF).

The 2008 Queensland Dengue Fever Outbreak
The idea that the Earth has moved from the Holocene into a warming period more appropriately termed the Anthropocene was first proposed in 2002 by Crutzen. Crutzen's basic idea was that increased numbers of people, using more land for agriculture and urban dwelling, depositing their waste into land, sea and air, and depleting the fixed resources of the natural environment, were having a deleterious influence on terrestrial, coastal and maritime ecosystems, atmospheric gas composition, the nitrogen, carbon and phosphorus cycles, basic climatic parameters, food chains, biological diversity and natural resources (1). Increasingly, however, it has become apparent that the dominant portion of these potentially catastrophic impacts, particularly those that continue to increase despite scientific evidence of their harm to other species and ecosystems as well as to human health, is driven by the presently legally required dominant focus on profit-seeking by multinational corporations; hence the introduction of the concept of 'corporatogenic' climate change and the substitution of 'Corporatocene' for 'Anthropocene' (2). A recent work of speculative fiction evaluates the possibility that an Ecocide Truth and Reconciliation Court may eventually hear cases brought against such miscreant corporate stakeholders in Earth's ecosystems (3).
One of the major deleterious biosecurity impacts of corporatogenic climate change is likely to be the spread of vector-borne infectious diseases to hitherto temperate regions. Dengue is a viral infectious disease spread by the mosquito Aedes aegypti. Approximately 50-100 million people are infected with the dengue virus each year, suffering significant morbidity and mortality, leading the World Health Organization to designate it a major international public health problem (4).
In 2008, several geographic areas of Queensland, including Townsville, Port Douglas, Mossman and Cairns, were severely impacted by an outbreak of Dengue Fever (5). While Dengue Fever has been comparatively rare in Queensland, the weather conditions at the time, a combination of warmth, humidity and rainfall, created the perfect environment for the relatively large-scale spread of the disease by providing an ideal breeding ground for the unintentionally imported vector mosquito. Dengue Fever is generally manageable, with most infected patients advised to rest and stay hydrated; only in rare and severe cases is a patient admitted to hospital for recovery from the disease. In some circumstances, Dengue Fever may result in long-term disability and lethargy, as well as haemorrhagic fever and fatal Dengue Shock Syndrome (6).
The epidemic lasted until May 2009 and was one of the most severe Dengue Fever epidemics to affect Australia. While only one death was confirmed to have resulted from the outbreak, the health implications for the Australian public were far from trivial. The rapid spread of Dengue Fever, and an incubation period during which the disease is undetectable or latent within the infected individual, limited the effectiveness of measures to prevent its spread by restricting mosquito breeding and the social interactions of infected or potentially infected individuals, including barring them from donating blood to the Australian Red Cross for use in blood transfusions (7). Accordingly, the donor pool supplying Queensland blood banks is likely to shrink during such an outbreak of Dengue Fever (8). Fewer eligible donors are legally able to give blood, which in turn burdens the health system by depleting the blood resources available for sick patients suffering from unrelated illness or injury.
During the 2008 Dengue Fever outbreak, the depletion of the Queensland blood supply became a critical health issue and forced the Queensland state government to make pleas to members of the public residing in non-infected areas of Queensland, and even interstate, to donate blood if eligible. In January 2009, during the early stages of the epidemic, the Red Cross confirmed that over 14% of its regular state blood supply had been lost to restrictions barring regular donors from giving blood if they had been in contact with infected areas or persons, with the ban remaining in place until no new outbreaks had been recorded for 3 months (9). The public were well informed (i.e., by media, SMS-texting and door-knocking) about preventative strategies such as removal of water-holding containers, use of pyrethroid indoor surface sprays (delivered by the Queensland State Emergency Service) and expert treatment of larger water-holding containers with s-methoprene pellets. Companies involved in the tourism, transport and media industries had their investments restricted by these public health regulations. This Dengue Fever outbreak comprised 931 confirmed cases and one death, at a cost to Queensland Health of approximately $AUS 3 million (10). It was not the first epidemic of Dengue Fever in Queensland.
In Australia, such recent outbreaks of Dengue Fever in the northerly regions are an outcome of corporatogenic climate change-initiated temperature increases (11). Studies monitoring the spread of Dengue Fever in Queensland, for instance, demonstrate that the climatic conditions associated with Queensland's tropical climate, including higher temperatures, increased rainfall and higher humidity, create an ideal environment for the spread of Dengue Fever by mosquitoes once the virus has been introduced to the area (12). The decade 2000 to 2009 was hotter than any previous decade in Queensland, with average temperatures more than 0.5 degrees Celsius above the recorded 1961-1990 average (13). Over the next 35 years, climate change projections indicate that Queensland will suffer temperature increases of up to 2.2 degrees Celsius and increased periods of tropical rainfall (13). Mosquitoes, the vector for Dengue Fever, thrive in warm, wet environments such as those that characterised the period 2000 to 2008 in Queensland (14). With the climate becoming wetter and hotter in Northern Queensland, the occurrence of mosquitoes carrying the disease will increase (14). Some reports predict that a 1-degree Celsius increase in average temperature combined with a 1 mm increase in average monthly rainfall would produce at least a 6% increase in the prevalence of Dengue Fever in Queensland (15). Some public health studies predict a 61% to 93% reduction in the Queensland blood supply due to dengue infections by the end of the current century if the climate continues to warm at predicted rates (16).
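Combining two of the figures quoted above gives a rough sense of scale. The calculation below is purely illustrative: it assumes the ~6% per degree estimate can simply be scaled across the projected 2.2 degrees of warming, and it ignores the rainfall term entirely.

```python
# Rough, purely illustrative projection combining two figures from the text:
# ~6% prevalence increase per 1 degree C (rainfall held constant), and up to
# 2.2 degrees C of projected warming in Queensland over the next 35 years.
per_degree_increase = 0.06
projected_warming_c = 2.2

# Compounding the per-degree effect across the projected warming:
compounded = (1 + per_degree_increase) ** projected_warming_c - 1
# A simple linear scaling gives a similar order of magnitude:
linear = per_degree_increase * projected_warming_c

print(f"compounded: {compounded:.1%}")  # about 13.7%
print(f"linear:     {linear:.1%}")      # about 13.2%
```

Either way of scaling suggests a prevalence increase on the order of 13% under the quoted assumptions, well short of a precise forecast but enough to indicate a material public health burden.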
In the incipient era of corporatogenic climate change, air and boat travellers to Australia will be at higher risk of carrying infectious diseases, such as Dengue Fever, from the increasing number of countries where they are endemic (17). Each year there are about ten million short-term departures by Australian residents to Dengue Fever-affected areas, and amongst such travellers the reported incidence of Dengue Fever is higher than that of other travel-related diseases (17). A major risk here is entry of the vector mosquito via quarantine breaches by legal and illegal international vessels arriving at Australian ports or other mainland sites, as well as mosquito eggs in cargo transported inland. This was probably the cause of the 2006 A. aegypti incursion at Groote Eylandt and the spread of Dengue Fever to Tennant Creek, NT, in 2004 (18). So how do Australian Federal and State governments respond to such challenges, particularly given that much of this climate warming is driven by widespread human usage of carbon-rich fuels (oil, coal and natural gas), as promoted by multinational corporations in those industries? The question is complicated by the significant amounts those corporations donate to political parties, their capacity to hire lobbyists, the various regulatory carbon lock-in tax subsidies and incentives they currently enjoy and, as we shall see, the emergence of new trade- and investment-based rights allowing multinational corporations to challenge legislation claimed to indirectly expropriate their investments.

Australian Quarantine and Public Health Regulation for Preventing Disease Epidemics
Present-day prevention, control and treatment of vector-borne diseases, such as Dengue Fever, is coordinated by various State and Commonwealth legislation and guidelines. In Queensland, as an illustrative example, the current relevant state legislation for preventing the spread of diseases such as Dengue Fever is the Public Health Act 2005 (PHA), which addresses anything that is, or is "likely to be, hazardous to human health, or that contributes to, or is likely to contribute to, disease in humans or the transmission of an infectious condition to humans" (21).
Mosquitoes, as the vector for Dengue Fever, are a public health risk as described in the PHA, and their due notification would enable the legal implementation of control programs via s. 36 of the PHA in relation to disease control.
There is no effective vaccine registered in Australia for Dengue Fever. Prevention focuses on limiting mosquito breeding, restricting human exposure to mosquitoes and rapid, symptomatic, test-proven expert case tracing to contain active areas, especially in high-risk institutional settings such as offices, hospitals, schools, aged-care facilities, tourist accommodation and prisons (all likely to be areas of significant future multinational corporate investment) (22). The aim of identifying common breeding sites and eliminating both larval and adult populations of Ae. aegypti and Ae. albopictus is detailed, for example, in the Framework for surveillance and control of dengue virus infection in Australia (23) and the Queensland Health Dengue Management Plan.
A specific constitutional authorisation exists for Australian federal legislation to control such population-focused infection-control problems. Section 51(ix) of the Constitution of Australia grants the Commonwealth legislative power to make laws on the subject of quarantine. Since 1908, the Quarantine Act based on this constitutional power enabled the Commonwealth to take protection and prevention measures to ensure highly infectious diseases and agricultural pests did not enter and proliferate in Australia. Section 2A of the Quarantine Act established the Australian Quarantine and Inspection Service (AQIS) and initiated the federal takeover of all quarantine stations in Australia from state and territory hands. The Quarantine Act was drafted when most travellers and goods arrived in Australia by ship, and its focus was to protect Australia from outbreaks of more readily 'quarantinable diseases' such as bubonic plague, smallpox, yellow fever and cholera (24). Visitors to the Quarantine Station near the entrance to Sydney Harbour can still see the names of many of those quarantined there carved in sandstone rock.
There is a tradition of viewing quarantine law as "anti-commercial, anti-social and anti-Christian." Nonetheless, Australia invested in regulating this area and developed one of the most strictly enforced quarantine systems in the world, amending the relevant legislation on numerous occasions in response to increasing trade and travel, technological advancements, agricultural expansion and emerging biosecurity threats (25). Yet, as Australia grew as a nation and increased its international trade, immigration and refugee programs throughout the late 20th century, a number of highly politicised and damaging incursions of exotic pests and diseases took place (26). This resulted in reviews to evaluate the efficiency and effectiveness of the quarantine system, the most significant of which were the 1996 report titled Australian Quarantine: a shared responsibility (Nairn Report) (27) and the more recent One Biosecurity (Beale review) (28). One of the main areas of focus of such reviews was the potentially negative impact of strict quarantine on international trade in goods and services, and so on corporate profits.
The Nairn Report provided 164 recommendations, of which the then federal Labor government accepted 149. Among the rejected recommendations was the establishment of an independent statutory body to oversee the activities of what was then known as the Department of Primary Industries and Energy. The Nairn Report recommended staggering regulatory checkpoints across pre-border, border and post-border compliance monitoring stations, as well as continuous use and improvement of scientifically based risk analysis to drive targeted, transparent, industry-friendly and cost-effective compliance monitoring programs. It highlighted the need to use databases and information technology to detect import threats, target staff resources to high-risk border activities and establish quality assurance arrangements with low-risk corporate importers. The improved risk analysis method promoted in the report ultimately comprised risk assessment, risk management and risk communication (29).

Transition from Quarantine to Biosecurity Regulation
A decade later, an independent review of Australia's quarantine and biosecurity arrangements was conducted by an eminent four-person panel led by former senior public servant Roger Beale (30). The Beale review found that Australia's biosecurity system operated well and was the envy of many countries due to its comprehensiveness, transparency, acceptance by industry and scientific rigour. Yet the Beale review put forward 84 recommendations to the government, which revolved around three principles: 1) an integrated biosecurity continuum involving risk assessment and monitoring, surveillance and response pre-border, at the border and post-border; 2) risk assessment reflecting scientific evidence and rigorous analysis; and 3) shared responsibility between the Commonwealth, state and territory governments, and between businesses and the general community. It also responded in a pro-corporate manner to calls from community activists for a zero-risk quarantine policy by labelling such a policy unattainable and undesirable: to operate this type of system would mean that every passenger, every bag and suitcase, and every container of cargo would need to be searched and even sampled and analysed, which the Beale review acknowledged would be not only counterproductive but also impossible to resource. The review recommended that it was time for the Australian Government to rid itself of the term 'quarantine' in favour of 'biosecurity'. The panel stated that 'quarantine' carried a negative and defensive connotation, as opposed to 'biosecurity', supposedly more pro-active and linked to the militarisation of borders that is another consequence of resource disruption and population movement due to corporatogenic climate change. The most influential recommendation to come out of the Beale review was the endorsement of new legislation to replace the Quarantine Act 1908.
Among the supporting recommendations, the panel insisted that the new Act draw on a much broader set of constitutional powers, many of which appear directly relevant to controlling an infectious disease outbreak such as Dengue Fever, including:
• provisions to deal with national biosecurity emergencies
• additional powers and resources to regulate post-border biosecurity
• legislative power to deal with international and domestic ballast water regulation
• powers to override state and territory law (subject to the NBA)
The new biosecurity legislation would also be underpinned by a National Biosecurity Agreement (NBA) in order to improve communication and collaboration with all states and territories. The panel recommended the establishment of a National Biosecurity Commission (NBC), including a Director of Biosecurity; this office would be responsible for biosecurity policy determination, import risk analyses and state biosecurity controls. Second, the panel proposed to combine AQIS, Biosecurity Australia and the Product Integrity, Animal and Plant Health Division to form a National Biosecurity Authority (NBA), which would be responsible for administering the new Biosecurity Act. Third, the panel recommended that an independent office of the Inspector General of Biosecurity (IGB) be established to conduct internal audits of the National Biosecurity Authority.
The government initially heeded the advice of the panel by agreeing in principle to adopt all 84 recommendations. The Commonwealth's response acknowledged that factors including Australia's ever-growing reliance on corporate-coordinated trade, increases in passenger numbers and cargo, and outbreaks of disease had exposed significant weaknesses in the current system. Furthermore, the threats of agriterrorism and climate change were areas of growing concern. In March 2012, the government published an internal departmental report stating that the establishment of the NBA and NBC, with their powers of corporate control and investigation, had been rejected by the government, while upholding its initial Beale review commitment to create a statutory office of the Inspector General of Biosecurity to provide independent oversight of departmental biosecurity functions (31). The relevant government department received over 100 submissions providing comments and recommendations regarding the new law and the Biosecurity Legislation Implementation Program.
The report outlined an investment of $481 million in the biosecurity reforms over the next four years and confirmed that an Intergovernmental Agreement on Biosecurity had been signed by the Prime Minister and all states and territories except Tasmania in January 2012. The agreement was designed to strengthen partnerships between the Commonwealth, state and territory governments, as proposed in the Beale review. The Biosecurity Act 2015 is a 723-page document, more than two and a half times the length of the Quarantine Act 1908. Under the Biosecurity Act 2015, quarantine officers are now referred to as Biosecurity Officers and Biosecurity Enforcement Officers. Items and vehicles have been grouped together to avoid confusion and streamline inspection procedures. Imported cargo, plant and material (mostly multinational corporate-owned) have been categorised as 'goods', and aircraft, vessels and ships are now termed 'conveyances'. The Quarantine Act 1908 grouped together vessels, persons and goods, which made the requirements for individual biosecurity risks difficult to isolate and interpret. The Biosecurity Act 2015 has simplified this issue by dividing individual biosecurity risks and their requirements into four chapters comprising: human health; goods; conveyances; and ballast water and sediment.
The biosecurity risk chapters are followed by supporting chapters outlining administrative tools, including monitoring, control and response, and compliance and enforcement. Powers to manage biosecurity outbreaks and emergencies have also been closely defined in the new legislation. Where the Quarantine Act 1908 had these powers scattered throughout various parts of the Act, the Biosecurity Act 2015 has collated them by allocating an entire chapter to dealing with emergencies. The layout improvements to the Biosecurity Act 2015 certainly warrant recognition. An important operational change within Chapter 9, Compliance and Enforcement, is the adoption of the Regulatory Powers (Standard Provisions) Act 2014 (RP Act): the monitoring, investigation and warrant powers of biosecurity officers under the Biosecurity Act 2015 must be carried out in accordance with the RP Act. Parliament's aim here is to provide a standardised framework for federal compliance and enforcement powers relating to regulatory schemes conducted under Commonwealth statutes (32). The new Australian biosecurity regulation shifts to a model of risk-based regulation; it allocates resources in proportion to the risks to society, considering both the impacts themselves and the likelihood that they will occur, as recommended by the Nairn Report and the Beale review. Goods that are imported into Australia (for example, those that might contain mosquito eggs) are now required to be evaluated against a Biosecurity Import Risk Analysis (BIRA). This system is designed to identify conditions that must be met by the corporate importer in order to manage the level of risk associated with the goods (i.e., preventing infestation with mosquito eggs).
The Biosecurity Act 2015 in Chapter 1 defines the appropriate level of protection for Australia against biosecurity risks (such as Dengue Fever) as a 'high level of sanitary and phytosanitary protection aimed at reducing biosecurity risks to a very low level, but not to zero'. The Biosecurity Act 2015 (Cth) can be viewed as pro-corporate, particularly in not creating strong regulatory oversight mechanisms such as a National Biosecurity Commission or a National Biosecurity Authority. Consequently, it could have been stronger in facilitating the capacity of authorised officers under the legislation to impose restrictions on corporate investments in traded goods and services, such as those necessarily imposed in cases of a Dengue Fever outbreak like the one studied here.

Biosecurity Restrictions and Indigenous Populations
Australia's indigenous populations have a significant presence in Northern Australia. If climate change-initiated infectious disease epidemics occur in those regions, they are one group likely to be significantly impacted by biosecurity measures, particularly as such measures, controlled by State enforcement apparatus, are likely to be negatively viewed by indigenous people as a recrudescence of colonial domination, discrimination and cultural oppression.
Imposition of biosecurity controls under the new Federal legislation against Dengue Fever in indigenous communities could create much political controversy. The health outcomes of Aboriginal and Torres Strait Islander peoples are already significantly worse than those of non-indigenous Australians (33). As a result, most Indigenous Australians experience lower overall life expectancies, higher age-specific mortality rates, a higher incidence of chronic diseases, and a high incidence of sleep-related disorders (33,34). This gap is mostly due to systematic discrimination amidst several social determinants (33,34). Some relevant key factors would be directly relevant to a future Dengue Fever outbreak: the lack of equal access to primary healthcare and the poor standard of health infrastructure (e.g. housing, sanitation, food) in Indigenous communities (35).
In 2013, following tireless Indigenous campaigning to 'Close the Gap' between Indigenous and non-Indigenous Australians with respect to health and life expectancy, the Federal Government initiated the National Aboriginal and Torres Strait Islander Health Plan 2013-2023 (36). One of the plan's key priorities was to adopt a human rights focus in its approach towards reducing inequalities in health outcomes for Indigenous peoples. At a national level, human rights protections have yet to be enshrined in the Australian Constitution, let alone on a statutory basis. Problematically, this renders citizens defenceless against the government, particularly when it seeks to employ restrictive measures to prevent the spread of infectious disease, such as those available under the Biosecurity Act 2015 (Cth). Some Australian constitutional provisions could be argued to confer human rights-related protections, such as Section 117, which essentially prohibits legislation that has the effect of discriminating against citizens based on their State of residence, in relation to inequities in funding for medical and other related health services (37). But such cases have yet to be successfully argued before the Australian High Court.
An Indigenous Australian engagement model on such emerging biosecurity issues has been developed from previous plant biosecurity operations against Mimosa pigra on Aboriginal land in the Northern Territory, taking into account specific relevant values in Aboriginal culture. For the Warramirri, Mak Mak Marranunggu and other Yolŋu language groups, key values include Djakamirr (empowerment), Raypirri-Wadatj (discipline), Marri-Yulkthirr Ga Gurrutu (trust and relationships), Rom (authority), Nhama Manymakum (respect) and Gumurrkunhamirr (partnership) (38). Such values have a manifest disconnect with the core value of shareholder profit-making presently required by legislation to be central to corporate operations. Protecting such indigenous values, as discussed in the next section, will create significant human rights issues.

Balancing Biosecurity Powers with Corporate Investor, Individual and Community Rights
Internationally, the United Nations Committee on Economic, Social and Cultural Rights recognises 'health' to be a fundamental human right, which the International Covenant on Economic, Social and Cultural Rights (ICESCR) defines as the right to enjoy the 'highest attainable standard of physical and mental health' (39). As a party to this covenant, the Australian government has an obligation to progressively realise these rights, and so must take steps 'within the maximum of its available resources' (39). In taking the necessary steps, the government is also under an obligation not to discriminate (39). This brings into view the health disadvantages suffered in Aboriginal and Torres Strait Islander communities, which are likely to be exacerbated by 'quarantine' under the new federal biosecurity legislation during outbreaks of Dengue Fever and other climate change-driven infectious diseases.
In 2009, Australia adopted the UN Declaration on the Rights of Indigenous Peoples (UNDRIP), which also contains rights pertaining to health, similar to those in the ICESCR. Furthermore, the UN Committee has stated that Indigenous peoples have a right to specific measures to improve their health outcomes that are consistent with traditional cultural values (40). However, a key problem is that the obligations conferred by UNDRIP are not legally binding (41). This leaves Indigenous communities in Australia particularly vulnerable if Australian biosecurity legislation is used to:
• Abate public health nuisances and destroy dangerous or contaminated materials;
• Collect data and records to facilitate the early detection of a health emergency;
• Take private property with just compensation as needed to care for patients or protect the public's health;
• Close roads, implement curfews, and evacuate populations where justified;
• Collect specimens and implement safe handling procedures for the disposal of human remains or infectious wastes;
• Test, screen, vaccinate, and treat exposed or infected persons;
• Separate exposed or infected individuals from the population at large to prevent further transmission of communicable conditions;
• Seek the assistance of out-of-state healthcare volunteers through licensure reciprocity; and
• Inform the population of public health threats through media and language that are accessible and understandable across cultures (42).
Indeed, the application of the new Australian biosecurity legislation could become a "… stark expression of the view that a public health emergency might necessitate the abrogation of privacy rights, the imposition of medical interventions, and the deprivation of freedom itself." This is particularly so with respect to Indigenous populations (43). It could also become an important issue whether medical practitioners who are legally obligated, through their conditions of employment, to apply powers under the Biosecurity Act 2015 (Cth) would have protection from legal liability for infringing the rights of individuals and communities (44).
The UN has suggested that the scope of the 'right to health' in the ICESCR extends beyond a disease prevention model to recognise community rights to the basic preconditions of health (e.g. food, water, housing, a healthy environment) (45). This obligation also involves the right to access the conditions and healthcare services required for delivering the highest attainable standard of care (45), which would include access to the medical services necessary to treat a Dengue Fever outbreak.
The United Nations Declaration on the Rights of Indigenous Peoples (UNDRIP) also contains similar protections for the health of Indigenous peoples (46). More specifically, with regard to the content of Article 12.1, the UN Committee on Economic, Social and Cultural Rights clarifies the main elements which comprise the 'right to health': availability, accessibility, acceptability and quality (47). The acceptability element requires health facilities, programs and services to be delivered in a culturally sensitive manner which protects the human dignity of traditional practices (48). The accessibility and availability of these health-related goods and services are also essential for Indigenous communities exposed to infectious disease outbreaks regulated by the new Biosecurity Act 2015 (Cth), as such communities are geographically disadvantaged compared to non-Indigenous metropolitan regions in Australia (49). Following this line of reasoning, health rights for Aboriginal and Torres Strait Islander peoples ought to allow for the conditions necessary for equal treatment and for advancing wellbeing on their own terms (50). They should also allow biosecurity measures to be applied in a way that respects Indigenous law, culture and traditions. This could be viewed as an aspect of the right to self-determination, recognised internationally (51) and within the domestic jurisdiction (52). It would emphasise empowerment, ensuring that Aboriginal and Torres Strait Islander peoples have the maximum opportunity to partake in the planning and implementation of their own response to an infectious disease pandemic or Dengue Fever outbreak. Aboriginal-controlled health services could play a major role in negotiating such arrangements (53). They could ensure that, even in biosecurity emergencies, holistic, comprehensive and culturally appropriate health care is still delivered to the affected communities (54).
Yet alongside such considerations must now be placed the right of multinational corporations operating in Australia to bring investor-State dispute settlement actions before a panel of private arbitrators not subject to standard rule-of-law mechanisms such as stare decisis (following precedent), an appeals process or restrictions on conflict of interest. The most directly relevant CPTPP provision states: "Nothing in this Chapter shall be construed to prevent a Party from adopting, maintaining or enforcing any measure otherwise consistent with this Chapter that it considers appropriate to ensure that investment activity in its territory is undertaken in a manner sensitive to environmental, health or other regulatory objectives" (55).
Yet the word 'sensitive' is soft in a legal sense; it must be massaged in proceedings to create a hard obligation. The same is true of the requirement for 'corporate responsibility' in Article 9.17. The harder-edged provision that "Non-discriminatory regulatory actions by a Party that are designed and applied to protect legitimate public welfare objectives, such as public health, safety and the environment, do not constitute indirect expropriations, except in rare circumstances" (56) is relegated to paragraph 3(b) of Annex 9-B, an annex in such an agreement being notable for its restricted capacity to apply to all parties. It is, further, hedged with the exclusion 'except in rare circumstances'. The United States, though not part of the CPTPP at inception, may well join it (giving US companies access to its mechanisms) once President Trump leaves office. Interestingly, Australia and New Zealand appear to have agreed not to allow their own companies to bring investor-State actions against each other (57).
The risk here, as the looming public health and environmental catastrophe of corporatogenic climate change becomes apparent, is that multinational corporations will use mechanisms such as investor-State dispute settlement in the CPTPP in effect to bully governments at the pre-legislative policy stage out of taking biosecurity measures likely to impede their profits.

Conclusion
Dengue Fever activity is increasing in many parts of the tropical and subtropical world as a result not only of rapid urbanisation in developing countries and increased international travel, but also of the decisions of multinational corporations, particularly the oil, coal and gas industries and those benefitting financially from them, to continue burning carbon-rich fuels that atmospherically trap the sun's heat. The potential for Dengue Fever outbreaks is likely to rise in this era of corporatogenic climate change as breeding conditions for the relevant mosquito expand. In such situations the application of Australia's Biosecurity Act 2015 (Cth) may create human rights issues not just for medical practitioners employed by public health authorities governed by the legislation, but also for Indigenous communities in northern Australian regions. Further, the capacity of Australian governments to take legislative biosecurity action against such threats, in the interest of public health and the environment itself, may be impeded by new corporate investor dispute rights incorporated in agreements such as the CPTPP.

Conflict of interests
The authors have no conflicts of interest to declare.

Introduction
Smallpox was declared eradicated in 1980, with known seed stock retained in two high security Biosafety Level 4 laboratories in the United States and Russia (1). In the decades since eradication, the risk of smallpox has been thought to be from clandestine stockpiles of virus outside of the official repositories. Experts agree the likelihood of theft from these laboratories is low, and that synthetic creation of smallpox is a theoretical possibility (2). Until 2017 it was believed that synthetic smallpox was technically too complex a task to be a serious threat. However, in 2017, Canadian scientists synthesised a closely related orthopoxvirus, horsepox, using mail order DNA and $100,000 (3). The experiment was not detected by any defence, intelligence or security surveillance systems, and was not known until the scientists themselves informed the WHO (3). In 2018, terrorist groups declared an intent for biological attacks against Western societies (4). There is capability for such an attack, with the recent open access publication of methods to manufacture an orthopoxvirus (5). The genome of the smallpox virus, variola, is publicly available, and the world's population is largely susceptible to smallpox due to the cessation of smallpox vaccination programs nearly 40 years ago and waning vaccine immunity in vaccinated older adults (6).
The Pacific is a unique and highly diverse geographic region that includes large islands such as Papua New Guinea and Fiji and small island nations such as Kiribati, with many islands and informal maritime transport networks. The region is affected by many disasters, such as cyclones, tsunamis, volcanic eruptions, earthquakes, rising sea levels and political conflict, which create systemic vulnerability to infectious disease epidemics (7). The Pacific Island states bear a disproportionate share of the global crisis in human resources for health because of weak health systems, insufficient production of trained health personnel and significant outward migration. Limited diagnostic and therapeutic capacity and the lack of funding for simple diagnostics and for therapeutic monitoring also impede epidemic response.
System problems such as coordination across countries, jurisdictions, agencies and disciplines, including those outside of the health system, may hinder emergency response to epidemics. A key aspect of strengthening health security during a bioterrorism incident is improving collaboration of responses between health, emergency management, defence, law enforcement and other sectors. The Pacific region is a critical part of the world in view of its geo-political strategic significance and unique vulnerabilities, which make control of infectious diseases a greater challenge in this region than elsewhere (8). A smallpox epidemic in the Pacific could spread globally and could be challenging to contain due to dispersed island geography, informal maritime travel and shortage of human resources. In this context, a smallpox simulation exercise was held in August 2018, with a focus on bringing together international stakeholders from a wide range of sectors including health, defence, law enforcement, emergency management and relevant non-government organisations.

Exercise Aim
To review preparedness for a bioterrorism attack in the Asia-Pacific region and globally.

Box 1. Dr Jona Mataika
Dr Jona Mataika was a medical professional renowned both locally and internationally for his pioneering role in the filariasis programme in Fiji. He was also the pioneer of virology and filariasis services in the country and the region. Dr Mataika served on the World Health Organization (WHO) steering panel on parasitic diseases. His research has been published widely and used extensively. He was awarded the Order of the British Empire in 1986 for his contributions to medical services. He served the medical sector from 1947 until his death in 1999.

Mathematical modelling of smallpox transmission (9) was used to simulate the epidemic under different conditions and to test the effect of interventions. An interactive format was used to explore decision making during the scenario. This paper has been prepared based on discussions during the exercise and expert input from participants.

Participants
Stakeholders from government and nongovernment organisations from Australia, New Zealand, several Pacific Island countries (PNG, Tonga, Vanuatu, Fiji, FSM, Samoa, Guam), the United States of America (USA) as well as industry and nongovernment organisations based in the United Kingdom, Singapore, Denmark and Switzerland were present.

Exercise date and location
Exercise Mataika was held on August 16, 2018 in Sydney, Australia.

Exercise format
An outbreak simulation tabletop exercise was developed by the ISER team at UNSW. The exercise alternated between clinical, public health, emergency and societal responses, with participants discussing cross-sectoral capability in responding collaboratively across the region and the world. Key weak points that influence the final size and impact of the epidemic were identified (based on mathematical modelling of transmission in Fiji and globally). Participants analysed the scenario from start to finish and identified and discussed key interventions that could prevent the worst possible outcome.
This included identifying which determinants of epidemic size are potentially within our control, and which are not, thus providing a framework for interventions to prevent and mitigate an epidemic of smallpox. Based on the scenario and discussions about response, recommendations were made to guide improved and more rapid and effective responses.

Scenario description
A first case of haemorrhagic smallpox occurs in a private hospital in Fiji, but the diagnosis is missed, as clinicians are not familiar with the disease. It is not until multiple cases are reported to the Ministry of Health and Medical Services that smallpox is considered as a diagnosis. The index case in the scenario was based on the index case in the Yugoslavian outbreak of 1972 (10). The patient had haemorrhagic smallpox, making the rash less obvious than in the classic form. The index case in Fiji is misdiagnosed as having an adverse reaction to an antibiotic, which is what occurred in Yugoslavia, a country that had not seen a case of smallpox for over 30 years at the time (10). While autopsy results are awaited, more cases start appearing. A team of four epidemiologists from WHO responds to assist with the outbreak investigation while the diagnosis is still unknown. Together with local public health officials, they consider chickenpox, dengue, monkeypox and smallpox as differential diagnoses. Samples are sent to Australia for testing. Days after the first case presented, case numbers have risen to at least 200. Initial case fatality estimates are about 40%. The health system is overwhelmed, with multiple hospitals treating cases and media reports causing public panic. Test results confirm variola virus on a Friday afternoon, 13 days after the index case presented, and the WHO promptly declares a Public Health Emergency of International Concern. Hundreds of cases have occurred by this time, and case interviews determine that all cases were at Nadi International Airport, either as travellers or visitors, on August 1st, making this the likely day of infection. Smallpox has an average incubation period of 12 days, with a range of 7-17 days. The index patient presented 12 days after landing at Nadi airport, supporting the airport as the likely site of infection. Law enforcement agencies and military are called in to investigate.
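The epidemiological reasoning above, working backwards from symptom onset through the 7-17 day incubation range to a common exposure window, can be sketched as follows. This is a minimal illustration only; the function names and the specific dates are chosen for this example, not taken from the exercise materials.

```python
from datetime import date, timedelta

# Smallpox incubation range in days (7-17, mean ~12), as cited above.
INCUBATION_MIN, INCUBATION_MAX = 7, 17

def exposure_window(onset):
    """Earliest and latest possible exposure dates for a given onset date."""
    return (onset - timedelta(days=INCUBATION_MAX),
            onset - timedelta(days=INCUBATION_MIN))

def consistent_with_event(onset, event_day):
    """Could this case plausibly have been infected at the candidate event?"""
    earliest, latest = exposure_window(onset)
    return earliest <= event_day <= latest

attack_day = date(2018, 8, 1)                   # day zero at Nadi airport
index_onset = attack_day + timedelta(days=12)   # index case presented 12 days later

print(exposure_window(index_onset))             # (2018-07-27, 2018-08-06)
print(consistent_with_event(index_onset, attack_day))  # True
```

Because August 1st falls inside every case's back-calculated exposure window, the airport is supported as the common site of infection.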
The WHO vaccine stockpile comprises 2.7 million doses of first-generation vaccine held in Geneva and 31 million doses (about two-thirds second-generation vaccine) pledged by various member states (11). Vaccine is deployed by WHO on day 27 post-release, the Monday after the diagnosis, reaching Fiji on day 28. However, the public health teams tasked with the initial response are unvaccinated, so they must first be vaccinated and protected before deploying to vaccinate others. Vaccine take occurs after 7 days, so a decision is made to deploy responders 7 days after vaccination, although there is evidence for protection earlier than this (12). After travel and logistics are arranged, vaccination begins on day 40 in Fiji.
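The delays described above compound into a 40-day gap between release and the start of ring vaccination. A rough timeline sketch follows; the step labels and the residual 5-day travel/logistics figure are inferred from the scenario text, not taken from the exercise's published model.

```python
# Cumulative response delay in the scenario (days post-release).
steps = [
    ("variola confirmed by reference laboratory", 13),
    ("WHO deploys stockpile vaccine (following Monday)", 14),
    ("vaccine arrives in Fiji", 1),
    ("responders vaccinated; wait for vaccine take", 7),
    ("travel and logistics; ring vaccination begins", 5),
]

day = 0
for label, delay in steps:
    day += delay
    print(f"day {day:2d}: {label}")
```

Summing the delays shows how a diagnosis on day 13 still leaves a four-week head start for the epidemic before the first contact is vaccinated.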
In this scenario, ring vaccination is used. Ring vaccination requires tracing and vaccinating all contacts of smallpox cases, with contacts prioritised by the closeness and degree of contact. Ring vaccination was used to eradicate smallpox and is the most efficient vaccination strategy to control the epidemic if vaccine supply is limited (13).
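Prioritising contacts by closeness of contact under a limited dose supply is the core of the ring-vaccination strategy described above. A minimal sketch, assuming a simple numeric closeness ranking; the field names and example data are illustrative, not from the exercise.

```python
# Allocate scarce vaccine doses to the closest contacts first.
# closeness: 1 = household, 2 = face-to-face, 3 = casual (illustrative scale).
def allocate_doses(contacts, doses_available):
    ranked = sorted(contacts, key=lambda c: c["closeness"])
    return [c["id"] for c in ranked[:doses_available]]

contacts = [
    {"id": "C1", "closeness": 3},   # casual contact
    {"id": "C2", "closeness": 1},   # household contact
    {"id": "C3", "closeness": 2},   # face-to-face contact
]

print(allocate_doses(contacts, 2))  # ['C2', 'C3']
```

With only two doses, the household and face-to-face contacts are vaccinated ahead of the casual contact, mirroring the prioritisation principle in the text.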
Forensic investigation by local agencies and Interpol determines that a bioterrorism attack took place at Nadi International Airport in Fiji on August 1st, with many people infected simultaneously and some travelling onward to other countries on day zero. The airport is closed on day 25 post-release for decontamination and forensic investigation. Many people in Fiji are desperate to leave; tourists and locals alike are trapped, although boat travel increases and locals move to outer islands and other Pacific Island nations through informal, undocumented travel.
Phylogenetic analysis shows a likely engineered strain. Clinically, it is responsive to available antiviral drugs, and vaccine appears to be highly protective. The clinical response comprises case finding, isolation and supportive therapy. There are no supplies of the antivirals cidofovir, brincidofovir or TPOXX (tecovirimat) in Fiji at this time, and there is limited human clinical evidence for the use of these drugs.
Other pressing issues include protection of health workers and other first responders, crisis communication and management of the worried well. Fiji has 24 public hospitals, 3 private hospitals and 1 military hospital, with a combined total of 1753 hospital beds. By day 25 there are already >2000 smallpox cases, exceeding the total available beds. Other urgent medical care, such as myocardial infarction and trauma, is compromised. Of the 2800 nurses in Fiji, 500 are infected and 320 are dead by day 30. There are 873 doctors in Fiji, of whom 185 are infected and 79 are dead. The health system is in crisis, and there are few other clinicians to draw upon. The Fiji Nursing Association calls a strike, demanding vaccination and personal protective equipment (PPE), which are in short supply. Conflict between private and public hospitals occurs, with rumours that vaccine and PPE will be prioritised for workers in public hospitals.
Based on smallpox transmission modelled using a published model (6), adapted for Fiji and the world, we follow the epidemic as it spreads across the globe in a matter of weeks. The attack at the airport results in cases arising in several other countries from people travelling out of Fiji on day zero. Smallpox has an R0 that may be as high as 4-5 (6) and is therefore potentially more infectious than influenza (R0 ~2) (14) or Ebola (R0 ~2) (14-17). It is spread by the respiratory route and rapidly propagates in a largely non-immune population (6). In the period prior to eradication, smallpox epidemics often occurred due to importation of smallpox by a single infected person, but in a deliberate attack there are likely to be hundreds or more infections on day zero, which makes it much more difficult to control the epidemic, especially as infected people disperse around a highly interconnected world.
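The difference between a single importation and a mass-exposure attack can be illustrated with a simple deterministic generation-by-generation growth calculation. This is an illustrative sketch only, not the published transmission model used in the exercise.

```python
# Deterministic branching sketch: each case infects r0 others per generation
# (no depletion of susceptibles or interventions, so an upper-bound picture).
def cumulative_cases(initial_cases, r0, generations):
    total, current = initial_cases, initial_cases
    for _ in range(generations):
        current *= r0
        total += current
    return total

# With R0 ~4 and three generations of spread:
print(cumulative_cases(1, 4, 3))     # 85 cases from a single importation
print(cumulative_cases(200, 4, 3))   # 17000 cases from a mass-exposure attack
```

The 200-fold difference in seeding carries straight through every generation, which is why a deliberate attack with hundreds of day-zero infections is so much harder to contain than a single importation.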
Cases infected in the initial attack at Nadi International Airport have occurred in multiple countries, and second-generation cases are appearing overseas. Law enforcement investigations identify the method of attack and uncover possible planning for a second attack, or multiple further attacks, on the Dark Web. Identification of perpetrators is difficult, but there appears to be a large network of global colluders, who are using cryptocurrency for financial transactions to support their activities.
As the epidemic spreads globally, Australia, New Zealand and other international carriers cease all flights to and from Fiji. Meanwhile, locals and stranded tourists desperately try to escape Fiji. Illegal boat travel escalates between islands in Fiji and within the Pacific, including boats of infected people approaching New Zealand and Australia. The boats are not allowed to land, creating ethical dilemmas and a media frenzy. Cruise ship companies immediately divert to avoid Fiji, and other ports refuse entry to cruise ships which have passed through Fiji. Food and supplies run short on stranded cruise ships. Regional governments begin to pressure Fiji to assist with the evacuation of their nationals.
Multiple conflicting requests and demands are made of Fiji and its government. On-the-ground responses from key allies of Fiji are not immediately forthcoming, although advice is provided on conference calls and essential supplies are provided by air drop. Countries are also focused on managing their own domestic cases of smallpox by now. There is resistance to military or health deployment into Fiji from other countries, due to a minimal risk appetite and a protectionist mentality exacerbated by upcoming elections in some countries. WHO GOARN puts out an alert calling for volunteers to respond. Compared to past outbreaks, there are far fewer offers from trained epidemiologists, and 10 of 39 offers are from people with contraindications to second-generation vaccines, leaving 29 potential immediate responders. Another group of 50 offers from semi-skilled or inexperienced people is assessed for suitability for deployment. The US CDC offers 10 people, but the remainder of its public health teams are working on their own domestic response.
In Fiji there are >1,000 first-generation cases infected at the airport and >5,000 second-generation cases, with case numbers rising rapidly. The Fiji MOH is conducting contact tracing but has over 100,000 contacts to trace and only 50 trained public health staff and 20 NGO volunteers, none of whom are yet vaccinated. Non-government aid agencies are unable to come to Fiji because of travel bans. With a shortage of hospital beds for patients, the issue of who will trace contacts and where contacts will be quarantined is discussed in Fiji and other affected countries. Community mobilisation is recognised as critical.
As 32,000 doses of vaccine arrive in Fiji, a larger scale attack occurs in a much larger, more populous country in Asia. With resources focussed on Fiji, this catches the world off guard and stretches the limited global stockpile of vaccine. Globally, critical delays occur in coordination of the response, including the need to vaccinate first responders before they can deploy. Staff need to be trained in vaccination procedures, care of the vaccination site and assessing vaccine take. Vaccinating the vaccinators and procuring supplies of bifurcated needles cause some delays. As the epidemic escalates, hospital beds reach capacity and other industries are affected by severe absenteeism. Lack of resources, including human resources, is a major problem. Modelling shows that the epidemic is most sensitive to case isolation, contact tracing and vaccination, and speed of response (9). Speed of response for isolation, contact tracing and vaccination is most critical in the early stages of the epidemic.
Shortages of human resources and of physical space to isolate cases are a problem, and health workers are dying of smallpox. Community engagement and mobilisation are recognised as essential but are not well coordinated, and crisis communication is poor. In a worst-case scenario, at the peak of the epidemic, worldwide only 50% of smallpox cases are isolated (mostly through the use of community volunteers and of makeshift buildings as isolation facilities) and only 50% of contacts are traced and vaccinated, causing a catastrophic blow-out in the epidemic. Under these conditions, modelling shows it will take more than a billion doses and 10 years to stop the epidemic (9). Less than 10% of the stockpiled doses are physically held by WHO, with the remainder pledged by donor countries (11). The stockpiles of certain countries remain unknown, but WHO estimates there may be up to 900 million doses in the world. The world's population is 7 billion. There is a 12-18 month lag time in vaccine production, and it is estimated that 300 million doses could be produced in this time by the very few producers of smallpox vaccines globally. In the scenario, countries are reluctant to provide pledged doses, as they are facing domestic epidemics of smallpox. Fiji must manage with 32,000 doses and must decide the best use of them. Discussions are held about diluting the available vaccine. The U.S. sends 1,000 doses of the antiviral drug TPOXX from its stockpile to Fiji early in the epidemic but retains the rest for domestic smallpox cases.
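The stockpile figures above imply a stark global shortfall. A back-of-envelope sketch using only the numbers quoted in the scenario narrative (not authoritative estimates):

```python
# Global vaccine gap, using the scenario's own figures.
world_population = 7_000_000_000
estimated_global_doses = 900_000_000   # WHO upper estimate of all stockpiles
production_during_lag = 300_000_000    # producible within the 12-18 month lag

coverage = (estimated_global_doses + production_during_lag) / world_population
print(f"Maximum coverage: {coverage:.0%} of the world's population")
```

Even under the most optimistic assumptions, fewer than one person in five could be vaccinated in the first production cycle, which is why efficient ring vaccination and rapid isolation matter far more than mass vaccination in this scenario.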
Critical infrastructure, travel and trade are affected, and countries scramble to get access to limited antiviral drugs, vaccine and personal protective equipment supplies. Foreign aid is reduced as countries divert resources to managing their own crises.
Managing communications becomes challenging. Rioting besets major cities, and both military and police responses are required. Mass gathering bans are implemented in Fiji and other countries. A black market has emerged in illegal boat travel, with irregular movements between outlying islands increasing and limited capacity to patrol all parts of the maritime border. Border disputes occur between countries. By day 40 post-release, the epidemic has spread to 26 countries. Around 50% of staff at key services in affected countries are absent during the peak of the epidemic; reasons for absence include fear, family obligations and illness. Basic services supporting the economy and critical infrastructure, including power, are now affected, and economic activity is severely curtailed. Supply chains are disrupted globally, causing shortages of essential medicines, supplies and food.
PPE is in short supply and vaccine is prioritised for health workers. Health workers, police and military are dying of smallpox, leaving systems weakened and unable to cope with the response. There is not enough vaccine, antiviral or personal protective equipment for health workers, police and military, who require protection as critical first responders. Police use riot gear as improvised PPE, but supplies are minimal. Health workers use home-made PPE. Other at-risk groups such as mortuary workers, waste services, cleaners and service personnel are also affected. Management of dead bodies and disposal of medical waste is a major problem, with transport companies refusing to transport medical waste.
In the final phase of the epidemic, which becomes a pandemic, the workforce is decimated, leaving critical infrastructure, transport, power, communications and food supplies compromised. Government assets are generally dispersed, depleted, and not readily available, resulting in severe conflicts regarding prioritization of limited supplies to health, police and border protection. Dissent is quashed using various means and penalties for insubordination are increased in uniformed services. Key modern systems become unreliable, including wireless and data communications, economy and banking (cash supply), replacement parts and manufactured items, processed food, and medications.
Globally, due to the lack of human resources and of physical space for patient isolation, and the larger attack in a highly populated developing country, only 50% of cases are isolated and 50% of contacts traced and vaccinated. Recovered people are mobilised to help with contact tracing and case finding, but food supplies are short and resilience is low. Vaccine production by the few manufacturers is occurring but cannot meet demand. Available supplies go to wealthier countries and not to the areas of greatest need where transmission is most intense. A major donor's funding is supporting novel vaccine development and scaled-up production. Trials of reduced-dose schedules have commenced and accelerated vaccine development has been approved, with mixed academic and public reaction. Ethicists are alarmed about the possible harms of rapidly implemented human experimentation and caution that the risks may outweigh the benefits. Misinformation and poor crisis communication exacerbate the situation. Differentiation between accurate and inaccurate information is now impossible. Reported information about case numbers, fatalities and affected regions varies drastically. Many governments attempt to control information and establish authoritative information sources, but frequently contradict themselves. Trust in government and authority structures has disappeared, and legitimate attempts at communication by authorities are viewed with suspicion and fuel conspiracy theories. Rural areas, including Pacific islands, are more resilient due to retained skills in subsistence living, including basic primary healthcare, but large urban cities are badly affected. Mass displacement and migration occur within countries and across national borders. This situation may meet the definition of a Global Catastrophic Biological Risk (GCBR) event (18). The final impact of the pandemic is more severe than that of a single nuclear strike, and societies are left decimated.
Societal recovery worldwide starts from a lower baseline than in the pre-epidemic era.

The Response
Key factors that influence the final size and impact of the epidemic were identified (empirically and based on mathematical modelling of transmission in Fiji and globally). Input was provided by multinational experts in health, defence, law enforcement and emergency management. Based on the scenario and the discussed responses, recommendations were made to guide improved, more rapid and effective responses. The purpose of exercising a severe scenario was to analyse the conditions that gave rise to the situation and how these can be modified and mitigated. Participants analysed the scenario from start to finish and identified and discussed decision making and key interventions that could prevent the worst possible outcome. Polling software was used to record individual decision making, results were provided in real time to the group, and participants reviewed responses and reached consensus. To conclude the exercise, participants identified determinants of epidemic size. These were then divided into those which are potentially modifiable and those which are not, thus providing a framework for feasible interventions to prevent and mitigate an epidemic of smallpox (Figure 1). The general principles would apply to the prevention and mitigation of any serious contagious infectious disease. Key recommendations around each of the modifiable factors shown in Figure 1 are summarised below.

Recommendations on modifiable determinants of a smallpox attack
The recommendations arising from discussion at the workshop are summarised in Boxes 2-11.

Box 2. Preventing an attack through intelligence, law enforcement and legislation
Identifying and stopping bioterrorist attacks before they occur through a combination of intelligence, law enforcement and new legislative approaches is the most effective primary prevention approach and should be given high priority.

Legislation
Technology has advanced at a more rapid pace than legal or regulatory frameworks, and the risk posed by technologies such as synthetic biology and dual-use research of concern is not fully understood. The synthesis in 2017 of an extinct horsepox virus closely related to variola not only showed that smallpox can be created in a laboratory (2), but the methods to do so were published in an open access journal in 2018 (5). Many capabilities for such an attack (such as synthetic biology and genetic engineering) are self-regulated. Attempts at risk analysis of dual-use research of concern by the U.S. and the European Union have been inconclusive (19,20). Discussions around regulation have been held mainly within the health and scientific communities, but global discussions are required with broader stakeholder groups including defence and intelligence agencies, law enforcement and legal experts. Global legislation and regulation could be improved for prevention of such an attack. Available tools include the International Health Regulations (applicable only after an attack), the Cartagena Protocol (for transport of genetically modified organisms), the Biological Weapons Convention (BWC), and domestic criminal and anti-terrorism legislation. The latter may have greater powers to stop a planned attack but varies by jurisdiction and country and has not been widely used or tested against planned biological attacks. The BWC focuses on nation states and is not enforceable. The Global Health Security Agenda (GHSA) Action Package Respond-2 touches on some relevant aspects, but is more relevant to response than prevention and is a voluntary framework that remains health-centric in scope (21). Given the risk of insider threat (22), legislation to create greater accountability of scientists should be explored, but is likely to be met with resistance from the scientific research community.
The Thomas Butler case saw radically different perspectives of the law enforcement and scientific communities, with the latter opposed to the prosecution of a renowned scientist (23). Anti-terrorism laws, which allow enhanced police powers, are framed around the risk posed to society outweighing individual rights and are controversial (24). In considering the need for legislation specifically to stop potential bioterrorism, the risk posed to the community by a suspected rogue scientist working with an infectious pathogen in a legitimate, DIY or clandestine laboratory must be weighed against the individual rights of that scientist. The potential harm of a biological attack with an infectious agent may be greater than a physical act of terrorism because a contagious pathogen may spread and cause harm beyond the initial attack. Existing anti-terrorism laws could potentially be used to prevent planned bioterrorism but may need to be modified. We recommend a global dialogue, as well as dialogue between scientists and law enforcement to seek new legislative and regulatory approaches to prevention and mitigation of biowarfare and bioterrorism. Given past disagreements and tensions between the scientific and law enforcement communities in high profile insider cases, this will require careful navigation to ensure a positive outcome. It should also be noted that in the event of an attack, the perpetrator may be unknown, so the differentiation between warfare and terrorism may not be possible, with implications for whether defence or law enforcement agencies have primary responsibility.
The important role of the aviation sector in managing suspected communicable disease or other public health emergencies on board aircraft and in airports under the Chicago Convention of the International Civil Aviation Organization (ICAO, Article 14) must also be considered. Article 14 of the Convention, titled Prevention of Spread of Disease, encourages contracting States to take "effective measures to prevent the spread of communicable diseases" and to collaborate with other relevant agencies to this end (25).
IATA (International Air Transport Association) and ATS (Air Transport Services) also have a major role in identifying travelers who appear unfit to fly, whether at the counter, in the passenger lounge prior to boarding, or at the time of boarding. The passenger agent should seek medical advice before allowing an ill passenger to check in or board the aircraft. The traveler may be requested to delay travel until they are well enough or have received medical approval to travel. If a traveler refuses to delay travel, the airline may exercise its right to refuse boarding.
In the event of illness on board an aircraft, the pilot and cabin crew must report this to the ICAO. All cases of illness or death on board must be reported to public health authorities (via ATS). The pilot should notify air traffic control, as per ICAO provisions (Annex 11), of any suspected cases of communicable disease or evidence of a public health risk on board. IHR Annex 9, the "Health Part of the Aircraft General Declaration", is available for use after landing to report an ill person on board.
Preparing countries through simulation exercises and health system strengthening measures, including IHR joint external evaluations and the development of costed national action plans, is also seen as an important mitigation measure. Strengthening global response mechanisms through better funding, including strengthening of GOARN and funding of partner institutions in the region, is also important. Better epidemic preparedness in communities will be of vital importance, as will empowering community volunteers to detect and respond to epidemics in their earliest stages.
For global risk management, public health events of international concern require robust global systems for detection, risk assessment and mitigation, as a platform for effective coordination. WHO's mandated role in this situation is clearly defined through the International Health Regulations and its instruments, such as the IHR Emergency Committee (EC). Timely information sharing between countries affected by multicountry deliberate release events would be essential for proportionate and defensible public health recommendations from the EC, including border closures.

Intelligence and law enforcement
Intelligence and surveillance for planned attacks, including monitoring of orders for materials, equipment and tools required for synthetic biology and genetic engineering of pathogens, is needed. The required intelligence may be different to methods used in detection of planned physical terrorism attacks or may require modification of currently used methods. Intelligence requires recognition of the unique nature of the threat, the role of insider threat and potential involvement of scientists, the possible sites of nefarious biological experiments (legitimate versus clandestine or DIY labs), prioritisation of intelligence gathering, and global discussion and coordination between intelligence and law enforcement agencies, with input from public health experts. We recommend global consensus and cooperation in developing better intelligence and surveillance systems for planned attacks that include rumour surveillance using expert input for key words and terms; tracking of trade in synthetic DNA, laboratory equipment, genetic code and supplies; external regulation of synthetic biology companies; and recognition of insider threats and monitoring of legitimate laboratories for movement of equipment and supplies, especially at the BSL 2 and 3 levels, which is the greater area of risk given strict monitoring of the far less numerous BSL 4 facilities. Whilst many agencies conduct surveillance for potentially harmful experiments, none appeared to identify the synthetic horsepox research (26). Although the researchers presented their research to WHO after it was completed (3), there did not appear to be awareness of the research through surveillance systems of intelligence agencies, highlighting the gap in intelligence and need for better surveillance.

Box 4. Speed and completeness of case finding and case isolation
Failure of diagnosis and triage has been seen recurrently with emerging infectious diseases in a globalised world. For example, during the height of the Ebola epidemic in West Africa in 2014, the diagnosis of Ebola in travellers from the affected area was missed in Nigeria and the U.S., resulting in a preventable epidemic in Nigeria (27,28). The index case of MERS coronavirus in South Korea was missed in a returning traveller from the Middle East, giving rise to the largest nosocomial outbreak outside the Arabian Peninsula (29). The last European epidemic of smallpox, in Yugoslavia, also featured a missed diagnosis of the index case, with smallpox only recognised after the occurrence of second-generation cases and the consequent epidemic (10). There is therefore a high likelihood of delayed diagnosis in the event of a smallpox attack, particularly because it is an eradicated disease of which the majority of practising clinicians have no clinical experience. The principles of recognition and triage of potentially serious infections should be broadly applied:
• Rapid diagnosis of early cases through improved awareness and emergency department triage protocols. Protocols themselves are of no use if clinicians at the front line are not aware of them or fail to use them. Effort must be put into training and awareness of frontline clinical staff in emergency departments and in primary care for key syndromes.
• Triage for rapid identification of patients who could reflect a biological attack, based on key syndromes. Syndromes may include severe acute respiratory illness (SARI), rash and fever (RAF), neurological syndromes (such as encephalitis or meningitis), or vomiting and diarrhoea.
• Clinical, epidemiological, and contact and travel history aspects should be evaluated as soon as possible, and travel history should be part of triage.
• Isolation protocols should be triggered by the syndrome rather than a laboratory diagnosis. Clinical syndromic triage should be used for early identification of all patients with SARI or RAF in emergency rooms and clinics.
• Appropriate infection control precautions and respiratory etiquette for source control should be promptly applied.
• Recognise that in some settings laboratory confirmation will be delayed, and isolation must therefore occur based on the clinical syndrome alone.
• Patients fitting a syndromic definition should be placed in an isolation room as soon as possible.
• If SARI or RAF patients cannot be evaluated immediately, they should wait in a dedicated RAF/SARI waiting area with spatial separation of at least two meters between each RAF/SARI patient and others. Source control (use of a surgical mask by the sick patient) may also be beneficial (30).
• Rapid diagnostic tests, including point of care tests (POCT), should be developed for smallpox. Such testing will need to include near-patient or point of care testing, and definitive confirmatory testing needs to be available in laboratories with biopreparedness capacity.
• POCT and laboratory testing for confirmation are only useful if there is first a suspicion of smallpox, which triggers the use of such technology.
• POCT for smallpox must include ongoing experience with such testing platforms and availability of proficiency testing programs (PTP) in order to validate the accuracy of test results. Such testing will be maximally useful where there is clinical suspicion of smallpox, although increasingly there are screening platforms for multiple biothreats. Protocols around ordering a test for smallpox may result in delayed laboratory confirmation, and POCT would need to fit into such protocols.
• Surge capacity needs to be built into laboratory and clinical services, as rapid spread of variola is one scenario. This needs to include training and availability of trainers for rapid POCT testing.
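The syndrome-triggered isolation rule described above can be sketched as a simple decision function. The syndrome labels follow the text (SARI, RAF, neurological, vomiting and diarrhoea); the function itself is a hypothetical illustration of the logic, not a clinical protocol.

```python
# Hypothetical sketch: isolation is triggered by the clinical syndrome plus
# epidemiological/travel history, never by waiting for laboratory confirmation.
# Syndrome names follow the text; the rule structure is illustrative only.

SYNDROMES_REQUIRING_ISOLATION = {
    "SARI",               # severe acute respiratory illness
    "RAF",                # rash and fever
    "neurological",       # e.g. encephalitis, meningitis
    "vomiting_diarrhoea",
}

def triage(syndrome: str, travel_or_contact_history: bool) -> str:
    """Return the triage action for a presenting patient."""
    if syndrome in SYNDROMES_REQUIRING_ISOLATION:
        # Isolate on the syndrome alone; do not wait for a laboratory result.
        return "isolate_now"
    if travel_or_contact_history:
        # No key syndrome yet, but history warrants separation and review.
        return "separate_and_review"
    return "routine_care"
```

Encoding the rule this way makes the key design point explicit: laboratory testing refines the diagnosis later, but the isolation decision depends only on information available at the front door.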
Case finding and isolation is critical, as it may reduce transmission of infection to nearly zero (31). It was a pillar of the eradication program and, in the event of vaccine shortages, case isolation is even more important. Undetected transmission in the community, including through global travel, is a risk to epidemic control. In a large epidemic, case finding and isolation will require:
• Mobilising adequate human resources to effect rapid isolation through the public health workforce and other workers trained in case finding. This requirement is in addition to the clinical workforce, as health workers will be occupied with the clinical response and may also be affected by smallpox.
• Identification and designation of large-scale physical spaces for isolation and treatment to ensure that at least 60-70% of cases are isolated rapidly. This could include use of facilities such as hotels or sports stadiums as surge capacity, as hospital beds will quickly run out.
• If possible, general hospitals should be avoided for treatment of cases, especially since available antivirals (which would reduce infectivity) may soon run short.
• Designated smallpox treatment facilities should be identified, with as few of these sites as possible, to avoid nosocomial outbreaks in general hospitals. The response to the Nigerian Ebola epidemic in 2014, which involved use of an abandoned building as the initial treatment centre, is a possible model (32). During the last Yugoslavian outbreak of smallpox, motels on the outskirts of the city were commandeered for case isolation, with separate facilities for quarantine of contacts (personal communication, JM Lane).
• In planning for designated smallpox treatment facilities, identify where these will be located and who will staff them (including pre-vaccinated first responders), and reduce the number of facilities to a minimum. Avoid treatment in primary care and ensure active messaging to discourage patients from seeking primary care.
• Designated treatment facilities should be spatially designed with triage areas and separate areas for suspected and confirmed cases.
• Active messaging and communication to the community, including symptoms to look for, risk factors, where to seek help (designated treatment and screening facilities), and how to handle suspected cases.
• A hotline (telephone or web-based) for assistance for people with symptoms.
• Engaging with communities rapidly, using transparent communication and all means of communication, including social media, will be of critical importance given high levels of uncertainty about the geographic scope of further deliberate release.
• Mobilising community volunteers, including recovered patients, to assist with case finding and ensuring at least 60-70% of cases are tracked and isolated.
• Community volunteers who are susceptible to infection should be vaccinated.
• The military may have a role in fitting out temporary facilities, especially in green-field sites or stadiums. Military tentage, bedding, catering, hygiene, water purification, fencing and other items can be mobilised relatively rapidly with endogenous logistic support. This approach was very successful during the 2014 West African Ebola epidemic in improving case isolation and treatment rates (33).
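The scale of physical space implied by the 60-70% isolation target can be estimated with simple occupancy arithmetic (Little's law: beds in use is roughly admissions per day times mean length of stay). All numbers below are assumptions for illustration, not figures from the exercise.

```python
# Back-of-envelope sketch of isolation capacity needs. The daily case rate and
# length of stay are assumed illustrative values, not exercise outputs.

def isolation_beds_needed(daily_cases, isolation_fraction, mean_stay_days):
    # Little's law: occupancy = admission rate x mean duration of stay.
    return daily_cases * isolation_fraction * mean_stay_days

# E.g. 500 new cases per day, 70% isolated, ~14 days spent in isolation:
beds = isolation_beds_needed(daily_cases=500, isolation_fraction=0.7,
                             mean_stay_days=14)
```

A requirement of this order quickly exceeds ordinary hospital bed stock, which is why the recommendations above point to hotels and stadiums as surge facilities.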

Box 6. Speed and completeness of contact tracing
In the scenario, a delay of 7 days occurs because the vaccinators must first be vaccinated themselves to enable deployment. The scenario design was intended to illustrate the critical nature of a delay of even a matter of days in commencing the response. We used an optimistic assumption of almost-immediate deployment of vaccine from the WHO stockpile, but the process of releasing the stockpile in reality may add further delays to the time to commencing ring vaccination (11). Following our exercise, WHO is planning to conduct a smallpox simulation exercise that would help test the validity of these assumptions. We recommend:
• Having a small cohort of pre-vaccinated and trained first responders, including public health staff and vaccinators, police and defence forces, who can form teams to respond immediately to an attack (assuming the attack strain of smallpox is vaccine sensitive). Non-replicating vaccine would minimise the risk of this approach.
• Vaccine could be sent with a group of vaccinators who are immunised when the stockpile is mobilised, to commence immediate ring vaccination.
• Use of N95 respirators and other PPE for airborne disease precautions can accelerate deployment after vaccination.
• Consider mobilising people who have been vaccinated in the past, as they would respond to revaccination faster and could potentially deploy earlier. More research is needed on this (34).
• Ensuring training materials are up to date and available, and that a critical mass of responders have recent experience with smallpox vaccination techniques before any epidemic occurs, to avoid delays during an epidemic.
• Ensuring adequate vaccine and bifurcated needle supplies and rapid scale-up plans. Given a 12-18 month lag time in vaccine production, in the event of a critical shortage vaccine dilution can be considered while formal vaccine surge production is commencing (35).
• Physical security of vaccine and antiviral manufacturing sites should be a priority, as these sites may come under attack in the event of shortages.
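The cost of the 7-day deployment delay can be made concrete with growth arithmetic: under early exponential growth, every day of delay multiplies the number of infections the response must chase. The doubling time below is an assumption for illustration, not an estimate from the exercise modelling.

```python
# Sketch of the penalty for delayed response under early exponential growth.
# The doubling time is an assumed illustrative value.

def size_multiplier(delay_days, doubling_time_days):
    # Case counts grow by a factor of 2 every doubling time.
    return 2 ** (delay_days / doubling_time_days)

# With an assumed early doubling time of 7 days, the scenario's 7-day delay
# doubles the epidemic size at the moment ring vaccination begins:
m = size_multiplier(delay_days=7, doubling_time_days=7)
```

The same arithmetic motivates the pre-vaccinated responder cohort recommended above: removing the need to vaccinate vaccinators removes the delay entirely.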
Rapid contact tracing is part of the ring vaccination strategy and requires planning. This will require human resources for contact tracing, physical space for quarantine, a protocol for transferring contacts who develop symptoms, and trained, protected public health staff to monitor contacts. On average, each infected case will have 10-11 contacts (36).
• Given that physical space requirements for case isolation may exceed available health system capacity (and will be a higher priority), alternative plans for quarantine of contacts should be made.
• Mobilising adequate human resources to effect rapid contact tracing and quarantine is crucial, as health workers alone will not be able to do this and may also be affected by smallpox. The requirement for contact tracing will be an order of magnitude greater than for case finding. No health workforce in the world will have capacity for contact tracing by trained public health staff in a large-scale epidemic, so community volunteers will likely be required.
• Capacity will be needed for daily monitoring and surveillance of contacts. This could include POCT devices and training in their use, in order to rapidly diagnose cases arising in contacts, but could also be done with clinical protocols for triaging contacts who develop symptoms. Self-reporting systems for contacts who develop symptoms can also be used.
• Use of home quarantine, coupled with follow-up and surveillance of contacts, may be the only feasible option in some settings. Contacts could be provided with free mobile phones for communication.
• Ensuring adequate food and other supplies for quarantined contacts.
• Plans should be in place for rapid transportation to health facilities of contacts who develop symptoms.
• Group quarantine can be considered for low-risk contacts, and was used during the Ebola epidemic (37).
• Engaging with communities rapidly, using transparent communication and all means of communication, including social media, is recommended.
• Mobilise community volunteers to assist with tracking contacts of cases, ensuring at least 60-70% of contacts are tracked and vaccinated.
• Consider financial incentives for community volunteers; this approach was used during eradication.
• Vaccinate community volunteers and provide PPE.
• Establish systems to ensure quarantined contacts are followed up for development of symptoms, transferred to isolation if they become ill, or released from quarantine after the upper limit of the incubation period.
• Generate evidence with investigational tools to determine whether vaccination is required for contacts of contacts.
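The order-of-magnitude point about contact tracing workload follows directly from the 10-11 contacts per case figure (36). The sketch below assumes a monitoring window spanning the incubation period and a nominal tracer workload; both staffing values are illustrative assumptions.

```python
import math

# Rough workload arithmetic for contact monitoring. Contacts per case follows
# the text (36); the 17-day monitoring window (assumed upper limit of the
# smallpox incubation period) and per-tracer workload are assumptions.

def tracers_needed(new_cases_per_day, contacts_per_case=10,
                   monitoring_days=17, contacts_per_tracer_per_day=20):
    # Each day's new cases add contacts who remain under watch for the
    # full monitoring window, so active contacts accumulate.
    contacts_under_watch = new_cases_per_day * contacts_per_case * monitoring_days
    return math.ceil(contacts_under_watch / contacts_per_tracer_per_day)

# E.g. 500 new cases per day implies thousands of dedicated tracers:
staff = tracers_needed(new_cases_per_day=500)
```

Numbers of this size are why the recommendations rely on community volunteers rather than trained public health staff alone.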

Box 8. Protecting critical infrastructure and business continuity
• Given there is little pre-symptomatic transmission of smallpox, and people tend to be too ill to leave home once symptomatic, the relative benefits of social distancing will be much less than for pandemic influenza or other infections with pre-symptomatic transmission. In a crisis with limited resources, it would not be an efficient use of those resources to invest in excessive social distancing. The only instance where this may be useful is to contain an epidemic originating on an island, where containment within the island would be desirable.
• Social distancing may be introduced as a formal public health measure during the early phases of a pandemic, including the closure of public spaces, schools and public events, avoidance of crowded spaces, and delay of travel.
• Personal protective equipment (PPE) was discussed. PPE was not routinely used during eradication, with vaccination being the mainstay of protection for health workers. However, it was felt that in current times health workers would expect to be provided with PPE and airborne precautions. Community volunteers engaged in the response could also be provided with PPE.
• Preventing, or at least slowing, the movement of infected people across borders is a strategy included in most pandemic plans for influenza, though the emphasis varies greatly depending on the circumstances of specific countries. This approach should also be applied to smallpox. Such measures are more relevant for island states than for countries with multiple porous borders.
• In the case of the Pacific or an island being the site of attack, sequestration of an affected island could be considered to stop outward transmission.
• Border closure, or protective sequestration, was effective at slowing and even preventing entry of pandemic influenza into some Pacific islands in 1918 (38). Faced with an epidemic of the severity described here, complete border closure for unaffected island countries such as NZ could be highly cost effective, even if such closure had to be extended for many months (39,40).
• Entry and exit screening may also have a role in protecting island countries from severe epidemics. Entry screening generally has limited effectiveness because of the asymptomatic incubation period of smallpox (and most other infectious diseases), but combined with rigorous quarantine it might be effective for islands with low visitor numbers (41). Exit screening could also be considered to reduce transmission of smallpox between islands and island states. For example, NZ Ministry of Health guidelines include the option of exit measures, such as screening, for passengers departing on ships and aircraft from NZ to small Pacific nations, especially those where most of the air traffic is via New Zealand (42).
• Thermal image scanning and other forms of entry and exit screening at airports may be more useful than for influenza, as the prodrome (prior to the rash erupting on the skin) involves fever (43,44). While people with symptoms would generally be too unwell to travel, someone on a long-haul flight may board with mild or absent symptoms and be severely ill on disembarkation. People may also disembark during the incubation period and become sick afterwards. Protocols for managing these possibly exposed, asymptomatic people need to be in place.
• Airport protocols are required for triaging and isolating sick individuals and for protecting customs and immigration staff.
• Plans are required for border control and management of unofficial/illegal maritime transport, which may increase in the Pacific region in the event of travel bans.
• Travel advisories and the impact of travel bans should be considered.
• Plans for evacuation of nationals from foreign countries should be negotiated.
• During air travel, reaching travelers is inherently difficult due to factors including the volume of travelers, dispersal to distant points across the globe, language barriers and challenges related to public trust. Information provided on screens, monitors or static displays at airports in boarding or arrival lounges may be one of the best methods of raising awareness of public health issues.
• In the Mataika exercise, Fiji may issue a travel alert, providing health information at airports in Fiji and in neighbouring countries with direct flights to and from Fiji. The aviation sector in Fiji can also provide sample scripts to be read on board aircraft. The use of consistent public health messages by countries and the air industry increases the potential for traveler awareness of potential risks and actions to take. A key feature of successful communication is "one voice", where each agency provides consistent and timely information.
• Travelers should also be advised to visit a travel health clinic or international vaccination centre to collect health information about the country they are going to visit, and to be vaccinated if needed.
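The limited effectiveness of entry screening noted above can be quantified with a small simulation: a traveller who boards while still asymptomatic is only detectable on arrival if symptoms begin during the flight. The uniform incubation distribution and flight length below are simplifying assumptions for illustration, not epidemiological estimates.

```python
import random

# Monte Carlo sketch: probability that a smallpox-infected traveller who is
# asymptomatic at boarding becomes symptomatic (and so detectable) by arrival.
# Incubation is assumed uniform over 7-17 days; real distributions differ.

def detectable_on_arrival(flight_days=1.0, trials=100_000, seed=1):
    random.seed(seed)
    hits = 0
    for _ in range(trials):
        incubation = random.uniform(7.0, 17.0)     # days to symptom onset
        elapsed = random.uniform(0.0, incubation)  # asymptomatic at boarding
        if elapsed + flight_days >= incubation:    # onset before landing
            hits += 1
    return hits / trials

p = detectable_on_arrival(flight_days=1.0)
```

Under these assumptions only a small minority of infected travellers would be symptomatic on arrival, consistent with the caution that screening is useful mainly in combination with rigorous quarantine.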
• The vulnerability of critical infrastructure during bioterrorism or epidemic events is not often considered in health sector planning and preparedness, which may assume these systems remain functional. Given the heavy reliance of certain critical infrastructure on normal social interaction, interaction points are in reality numerous and significant.
• The proper functioning of key critical infrastructure, such as telecommunications, bulk transportation, essential emergency services, electricity, water collection and distribution, waste services, and health care services, depends on a segment of the working population carrying out essential operator, maintenance and sustainment tasks.
• Absenteeism of only a few percentage points in sectors delivering critical infrastructure will cause both direct and indirect impacts on service quality, error capture and fault correction.
• There is a need to identify interdependencies between critical infrastructure in normal operational states and to estimate likely interdependency and collective system failure states under stress, such as during significant epidemics. Risk controls should be directed not only at individual infrastructure vulnerabilities, but also at systemic and interdependency vulnerabilities identified through these processes.
• Maintain emergency governance to coordinate the response.
• Cascading failures, characterised by the alignment of multiple failing components of critical systems supporting society, are likely.
• Rewards for presenting to and undertaking critical infrastructure work duties, as well as punitive measures for failing to report to work, may be necessary.
• Developing resilience, flexibility and depth in critical infrastructure staffing pools to draw from during a crisis will be helpful.
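The interdependency analysis suggested above can be prototyped as a dependency graph with failure propagation. The sectors and dependency edges below are invented examples to show the technique, not an assessment of any real system.

```python
# Hypothetical sketch: model critical infrastructure sectors as a dependency
# graph and propagate failures to find collective (cascading) failure states.
# Sectors and edges are illustrative assumptions only.

DEPENDS_ON = {
    "water": {"electricity"},
    "health_care": {"electricity", "water", "telecoms"},
    "telecoms": {"electricity"},
    "transport": {"electricity"},
    "electricity": set(),
}

def cascade(initial_failures):
    """Return every sector that fails once dependencies are accounted for."""
    failed = set(initial_failures)
    changed = True
    while changed:
        changed = False
        for sector, deps in DEPENDS_ON.items():
            # A sector fails if any sector it depends on has failed.
            if sector not in failed and deps & failed:
                failed.add(sector)
                changed = True
    return failed

affected = cascade({"electricity"})
```

Running the propagation from each single-sector failure identifies which vulnerabilities are systemic, which is the analysis the bullet on interdependencies calls for.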

Conclusion
Exercise Mataika enabled cross-sectoral expert input into considering many aspects of a smallpox release and subsequent pandemic. We provide a framework for identifying and focusing on factors potentially within our control along the entire spectrum from pre-attack to recovery, from intelligence, legislation and law enforcement to public health measures and social mobilisation. We recommend that critical weak points be mitigated with prior careful planning, maximising prevention of planned attacks through intelligence gathering, and optimising a timely response and the recovery phase, whilst recognising substantial physical, infrastructure and human resources surge requirements in a pandemic. The exercise also highlighted the importance of international cooperation and the tensions which may arise between this need and domestic responses within each country, especially regarding the WHO pledged vaccine stockpile. Preparedness for a potentially catastrophic epidemic requires an inclusive and collaborative approach with all first response sectors and across nations, rather than a health-centric, localised approach to planning. Traditional planning focuses predominantly on medical counter-measures after an attack has occurred.
The impact of an epidemic and subsequent pandemic of smallpox would be substantial if arising in a low-income country with weak health systems and may have a very long duration. Practical aspects, like communication, the need to use community First responders include clinical health workers, public health workers, defence forces, police, paramedics, emergency services, fire fighters, customs and immigration staff, and workers in critical infrastructure such as energy. Each are equally important to the response and their protection must be planned for. The capability of first responder sectors is critical to an effective response. Workers may refuse to work if they do not receive adequate protection.
• Stockpile supplies of PPE for all first responders. In the absence of vaccinated responders at the beginning of an epidemic, PPE is the only protection available. The precautionary principle should be used and airborne precautions should be used (45). The last case of smallpox involved probable transmission through air conditioning ducts to a photographer working on the floor above a smallpox laboratory in the UK (46). Other studies have also documented variola in air samples (47,48). • At a minimum, stockpiles should contain enough vaccines (replicating and non-replicating) and antivirals for all first responders (not just clinical health workers). • Consider pre-vaccinated personnel who can form an immediate response team comprising of clinical health workers, public health workers, defence forces, police, paramedics, emergency services, customs and immigration staff. • Police, paramedics and customs staff are likely the least prepared but should be given equal protection to ensure response capability.
They may consider their own stockpiles. Defence forces in some countries hold their own stockpiles. • Protocols for decontamination of first response sites and ambulances should be in place.
• Energy supply and critical infrastructure for first responders should be ensured.
• Location-specific medevac protocols will be required to ensure a clear understanding and actions associated with infected responders.
• Burial procedures and protocols are needed, as well as supplies such as protective body bags (treated with hypochlorite).
• Recognise that pandemic events may last 1-4 years or longer, requiring long-term stockpiles and supply plans for drugs, vaccines and personal protective equipment. • Public-private partnerships for accelerated research and development during epidemics may be important.
• In a pre-epidemic period, research on optimising the cross-sectoral response to such an attack should be conducted.
• Exercising the response should go beyond the health sector response -joint exercises with health, defence, law enforcement, emergency management and community representatives would assist in planning and defining roles and key needs (49). • Coordination of technical support to affected countries should occur via Global Outbreak Alert and Response Network (GOARN), but may be challenged by reluctance of countries to provide support in the context of heightened uncertainty around likelihood of geographic spread.
In addition to social mobilization (described above under case finding and contact tracing) other areas for planning include: • Informing the public and travelers about disease control measures that may be in effect (eg. introduction of voluntary isolation, location of treatment centers that are open for ill individuals, hand hygiene, and early treatment). • Public Health and other agencies take responsibility for communicating potential public health risks in a timely and appropriate manner. Sometimes the message fails to reach the intended communities, including those people most at risk of the diseases and frontline workers. • Consistent messaging should be used by all relevant agencies.
• Volunteers in the response could be provided financial and other incentives.
• Civilian preparedness programs should be considered in non-epidemic periods to provide surge capacity of trained civilians who can take up various roles in the response.
Issues such as the use of volunteers, requirements for case isolation, protection of first-line responders, vaccination strategies, international cooperation and surge capacity in both personnel and physical facilities should be central to planning. Planning for such an event is often based on the assumed probability of the event alone. However, a risk analysis is required which also considers the impact of such an attack and factors such as human-to-human transmission potential (50), intent and capability. We know that there is declared intent for bioterrorism attacks against Western societies (4). There is also capability for such an attack, given the recent publication of synthetic biology methods to manufacture a virus very similar to smallpox. The genome of the smallpox virus, variola, is publicly available. We do not know whether those with intent have the capacity to generate variola virus in vitro, but the possibility is higher now than at any time in the past. As synthetic biology and genetic engineering technology continues to advance and become cheaper and more accessible, the risk will continue to increase. The principles of identifying influential and modifiable factors along the entire timeline of an event (from the planning of an attack to recovery) and focusing on these factors for preparedness can be applied to any serious emerging infectious disease threat.

Introduction
Smallpox is a potential weapon for biological warfare or bioterrorism. Therefore, after 9/11/2001 and the published claims made by the former head of the Soviet biowarfare program (1-3), public health experts agreed that a safer yet fully effective vaccine against smallpox was needed. Shortly after 9/11/2001, the Department of Health and Human Services created the Biomedical Advanced Research and Development Authority (BARDA). BARDA's mission is to encourage the development of vaccines, antibiotics, and antivirals for agents which might become weapons of biowarfare. BARDA was also given funds to purchase such products and create a National Stockpile. Smallpox was high on the list of agents of early interest to BARDA, because only a few million doses of Wyeth Dryvax were available for use.
In the last 17 years a great deal of modern virological research and development has been devoted to this work. Funds have been made available to stock newer vaccines in the US Government's National Strategic Stockpile. This stockpile now contains enough smallpox vaccine to immunize every man, woman and child in the United States. This paper reviews this effort. Readers needing a complete exposition of the virological and genetic information about third and fourth generation vaccines are directed to the chapter on Smallpox and Vaccinia in the 7th edition of Plotkin, Orenstein, Offit, and Edwards's textbook Vaccines (4).

First Generation Smallpox Vaccines
In the last half of the 20th century, first generation smallpox vaccines in the United States were largely preparations of the New York City Board of Health (NYCBOH) strain of vaccinia. These vaccines were of proven effectiveness, although there has never been a controlled trial of any vaccine against smallpox. They were administered by scratch or multiple pressure and, after 1965, largely by using the bifurcated needle.

Vaccination Technique
Most first generation vaccines used in the United States after 1965 were lyophilized for better shelf life, which enhanced their effectiveness when taken into tropical areas. A bifurcated needle is dipped into liquid vaccine (lyophilized vaccine after diluent has been added). Capillary action draws a droplet of vaccine into the crotch of the needle. About 15 firm but gentle downward strokes are made onto the skin of the arm near the insertion of the deltoid muscle. The first of these strokes dislodges the droplet of vaccine; the subsequent strokes through this droplet abrade the skin and allow entry of the vaccinia virus into the Malpighian layer. A tiny droplet of blood is often visible at the site, but frank bleeding indicates technique that is too vigorous. The site may be covered by a loose dressing, which helps avoid transfer of the virus to other sites (5,6). Good technique with fully potent vaccine causes a major skin reaction to develop by about 7 days: a central lesion surrounded by visible inflammation, the so-called "take". Because this is a viral infection, there is frequently some mild fever and discomfort around 6 to 8 days after inoculation (5,6).

Immunity
Immunity following vaccination includes both humoral antibodies and cellular immunity.
The orthopoxviruses are the largest viruses that infect humans, and their structure and functions are complex. The relationship between the various circulating antibodies against vaccinia virus antigens and the several cell-mediated immune responses evoked is likewise complex. This is an area of active research, with many animal species utilized and hundreds of HLA class I and II epitopes identified (4,7). No studies have definitively shown what level of antibodies, or what form of cellular immunity, is fully protective; we thus also do not know precisely how long immunity lasts. Currently the most accepted measure of neutralizing antibodies is the plaque reduction neutralization test (PRNT) (3,4,6,7,8). These antibodies become detectable by about the 6th day after vaccination and seem to last for several decades. Cellular immunity has been measured in several ways, with different researchers employing different tests for quantifying it; such immunity also lasts for several decades (4,6).
Epidemiological evidence suggests that some residual immunity lasts for decades after a single primary vaccination. When patients acquire smallpox several years after vaccination, the disease is generally mild, and death rates remain low until about five decades post vaccination (7,9,10).

Complications and Contraindications for Smallpox Vaccination
Anyone who has had a documented, face-to-face contact with a smallpox patient should be vaccinated. There are no contraindications to vaccinating such patients because smallpox is always worse than any complication of vaccinia. Because vaccinia is a viral infection, several complications can follow vaccination; anyone who has not had direct contact with a smallpox case must therefore be screened for contraindications (11,12,13).
Like many viral infections, vaccinia can occasionally cause post-vaccinial encephalitis, occurring in about one or two cases per million primary vaccinees. Encephalitis is more common in infants than in older patients, so infants should not be vaccinated unless they are in contact with smallpox patients.
Patients with eczema or a history of atopic dermatitis can develop eczema vaccinatum. This can be fatal, particularly for infants, in whom eczema vaccinatum can act like a serious burn with loss of protein and electrolytes. Thus, patients with atopic dermatitis should not be vaccinated, nor should family members who have close contact with them.
Patients with diseases or conditions that compromise their immune systems and those who are taking immunosuppressive medications are at risk for developing progressive vaccinia. This condition is often fatal, with vaccinia virus growing out of control and often spreading throughout the body.
Vaccinia does not increase fetal wastage or cause prematurity. However, since it is a viral infection, pregnant women should not be vaccinated unless they have had direct contact with smallpox patients.
Vaccination leaves live virus on the skin and the developing Jennerian vesicle sheds copious amounts of virus. Infants often scratch the vaccination site and transfer vaccinia to areas such as the eye. The vaccination site can be covered with a loose dressing to reduce such spread.
Good photographs of patients with serious complications of smallpox vaccination can be found in Fenner (6), or at the smallpox section of the CDC website (14).
With no smallpox occurring in the US after 1949, and in recognition of the frequency of complications of vaccination (11,12,13), the United States abandoned routine smallpox vaccination after 1971-72 (15).

Production of First Generation Vaccines
First generation vaccines were produced using techniques that would prohibit licensure today (6). The skin of animals, such as cows or sheep, was shaved and then widely inoculated with vaccinia. The resulting inflammatory exudate was scraped off about seven days after inoculation. This exudate contained animal proteins, bacteria, and possibly unknown animal viruses; the FDA would not currently license such vaccines. Thus, there is a need for vaccines that can be produced using methods that meet modern standards of good practice.

An Ideal New Vaccine
An ideal new smallpox vaccine would be a live virus that could be lyophilized to prolong its shelf life. It should be amenable to administration via a bifurcated needle. It should produce a visible major reaction ("take") so that successful vaccination could be documented without requiring laboratory work. It should be made in standard cell cultures in large volumes. Clinical trials with such candidate vaccines should produce data showing fewer and less serious complications than first generation vaccines.
Proving the efficacy of new smallpox vaccines will be difficult. There are no simple markers for full effectiveness: primary vaccination produces an array of circulating antibodies and a complex group of markers of cellular immunity (4,6,7), and the eradication of smallpox has made field trials of efficacy impossible. Therefore, the FDA has developed the "Two Animal Rule" to substitute for direct data on efficacy (16). This rule requires vaccine candidates for licensure to show efficacy in two animal species in which infection with an orthopoxvirus closely related to variola resembles human infection with Variola major.
There is no animal model in which infection with live Variola virus produces a disease closely similar to human smallpox. The orthopoxviruses that are virologically similar to Variola include monkeypox, ectromelia, buffalopox and vaccinia, but none of these causes a disease in mammals (including non-human primates) that is similar to Variola in humans (17,18,19,20). Jahrling has used large intravenous inocula of live Variola virus in monkeys. This often produces lesions on the skin similar to smallpox but does not include the widespread replication of the virus in reticuloendothelial tissues (18). These higher primates are expensive and difficult to work with, requiring special laboratory facilities, and such studies employing live Variola virus must be performed in the high-security laboratory at CDC in Atlanta and require permission from the World Health Organization. Thus, meeting the two-animal rule will be difficult.

Second Generation Smallpox Vaccines
Second generation vaccines are vaccinia strains that are clones of first generation strains of vaccinia of known effectiveness, but are grown in tissue culture and are thus free of bacteria and animal proteins. There are several such vaccines, but only one, ACAM2000, has been subject to non-inferiority trials comparing it directly with the first generation vaccine Wyeth Dryvax (a New York City Board of Health vaccinia vaccine that was used extensively in Africa and Asia during the Smallpox Eradication Program) (7). Dryvax is now known to be a mixture of closely related vaccinia strains, a single one of which was picked to yield ACAM2000.
Straightforward non-inferiority trials allowed licensure. Non-inferiority of ACAM2000 to Dryvax was shown by measurement of neutralizing antibodies, rates of major reactions ("takes"), and measures of cellular immunity (7,21). Such head-to-head comparison trials employing Dryvax or other first generation vaccines are no longer possible because of the documentation of myopericarditis following vaccination with both first generation vaccines and ACAM2000 (22,23,24,25). ACAM2000 is now licensed in the United States and BARDA has added several hundred million doses to the United States Government's National Strategic Stockpile (26,27).
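The non-inferiority logic described above can be made concrete: the new vaccine's response rate may fall below the comparator's, but not by more than a pre-specified margin. A minimal sketch of such a check, using invented numbers rather than the actual ACAM2000/Dryvax trial data, might look like:

```python
import math

# Illustrative non-inferiority check on "take" rates.
# All counts and the 5% margin below are hypothetical assumptions,
# not values from the licensure trials.
def noninferior(x_new, n_new, x_ref, n_ref, margin=0.05, z=1.96):
    """True if the lower 95% confidence bound of the difference
    (new minus reference response rate) lies above -margin."""
    p_new, p_ref = x_new / n_new, x_ref / n_ref
    se = math.sqrt(p_new * (1 - p_new) / n_new + p_ref * (1 - p_ref) / n_ref)
    lower = (p_new - p_ref) - z * se
    return lower > -margin

# e.g. a 97% take rate versus a 98% reference take rate
print(noninferior(485, 500, 490, 500))  # True
```

Real trials pre-specify the margin and analysis populations and often use exact or stratified methods; this toy version only illustrates comparing the confidence bound against the margin.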
There continue to be safety concerns about the use of first and second generation smallpox vaccines. There has been an increase in the prevalence of eczema since the studies of complications of vaccination performed in the 1960's (28). There has been a dramatic increase in the prevalence of immunocompromised patients, given HIV, oncological treatments, organ transplants and other conditions which jeopardize the immune system (28). Patients with severe immunological defects are at risk for developing progressive vaccinia, in which vaccinia virus continues to grow unchecked, frequently resulting in death. These concerns have led to vigorous efforts to develop third generation vaccines.

Third Generation Smallpox Vaccines
Several third generation vaccine candidates have been developed (4). During the 1960s, the Germans produced a vaccine called Modified Vaccinia Ankara (MVA): a first generation vaccinia strain derived from horses was passaged more than 570 times in chick embryo fibroblasts. The result is a live virus, but one that does not replicate in human tissues; thus, it acts somewhat like a killed virus vaccine and does not produce a visible skin lesion. Modern genetic analysis shows that MVA has lost several genes from its parent vaccinia strain (29). Bavarian Nordic markets its MVA strain, IMVAMUNE, and has completed several trials in humans to demonstrate safety. IMVAMUNE has a potency of 10⁸ TCID₅₀ after reconstitution, and optimal immunity requires two doses of 0.5 ml reconstituted vaccine delivered subcutaneously. Bavarian Nordic's many trials have included patients with HIV and patients with a history of eczema. They have employed several dosing schedules, although none have included young children (30-38).
MVA is not an optimal vaccine for controlling smallpox outbreaks. Optimal immunity requires two doses of MVA administered subcutaneously about four weeks apart. IMVAMUNE is supplied in individual vials containing 0.5 ml of vaccine with 10⁸ TCID₅₀ per dose. It must be refrigerated up to the time of use and must be injected with a needle and syringe. Because it does not produce a visible skin lesion, meticulous records must be kept: health workers cannot tell at a glance whether a patient has received the initial dose. Since optimal levels of immunity require two doses, contacts of cases may not be protected after initial processing and their first inoculation. IMVAMUNE may be a good vaccine in situations where there is no smallpox but people with contraindications to vaccination with second generation vaccines require vaccination; these might include laboratory workers exposed to orthopoxviruses and medical workers who might form teams of caregivers during an actual smallpox outbreak.
The Japanese have developed a third generation vaccine named LC16m8, derived from the first generation Lister strain of vaccinia. While there are fewer published trials than with MVA, LC16m8 has been used extensively in Japan, with few serious complications (39,40,41,42). This vaccine would be good for outbreak control: it is lyophilized, can be used without refrigeration, and is administered with a bifurcated needle. It produces a visible major skin reaction at the site of vaccination, so that a successful vaccination can be documented at a glance.

Fourth Generation Smallpox vaccines
Many new vaccinia-derived strains have been developed by genetic engineering. Several third and fourth generation vaccines have been created by deleting genes from vaccinia or by creating strains that have epitopes common to variola or vaccinia (4,43,44,45). These are under development in laboratories, with a few having progressed to animal experiments. Mostly, these experiments have employed small mammals, which were challenged after immunization with viruses such as vaccinia or ectromelia.
While these candidate vaccines may not result in a licensed immunologically excellent vaccine (see below), the interest in them may continue as we increase our understanding of the immune response, including which epitopes seem to be protective against other orthopoxviruses (45).

Barriers to Developing Newer Smallpox Vaccines
While it would be good to have a safe and effective new vaccine to supplement or replace ACAM2000, development of such a vaccine is doubtful despite excellent viral genetic and immunological work. In the absence of actual smallpox or credible threats of bioterrorist attacks, there probably is no market for such a vaccine, and indeed funding for research in this area is limited. Large scale production facilities capable of producing large lots do not exist and would be costly to create and operate. DA Henderson has estimated that the laboratory research, human field trials, large scale production, storage and marketing of a new vaccine would cost between $750 million and $1.75 billion US dollars (46). More recently, Koblentz pointed out that the US Government is the only possible purchaser of a new vaccine and BARDA has not shown interest in adding to the current National Strategic Stockpile (47). Indeed, BARDA is more interested in ensuring that the large stocks of ACAM2000 and IMVAMUNE will maintain their potency for a very long shelf life.
Given the need for vaccines against HIV, Ebola, SARS, Zika, and other viruses with serious potential as public health problems, justification of diverting the funds and expertise to create and actually produce an improved smallpox vaccine seems highly unlikely.
While first generation Dryvax cost less than a penny a dose during the Smallpox Eradication Program, new vaccines are much more costly. The newer second and third generation vaccines that have been purchased by BARDA for the National Strategic Stockpile would be free to the public and used only after a documented need for vaccination. ACAM2000 and IMVAMUNE prices are not available, but from the amounts bought by BARDA for the National Strategic Stockpile, we can estimate that their cost is between $4 and $17 per dose.
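The per-dose estimate above is a simple division of contract value by doses procured. A trivial sketch, with invented contract figures since the actual BARDA amounts are not public, would be:

```python
# Back-of-envelope per-dose cost implied by a procurement contract.
# The figures in the example call are illustrative assumptions,
# not BARDA's actual contract values.
def price_per_dose(contract_total_usd: float, doses: int) -> float:
    """Average cost per dose implied by a contract total."""
    return contract_total_usd / doses

# e.g. a hypothetical $425M contract for 100M doses:
print(price_per_dose(425e6, 100_000_000))  # 4.25
```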
Research on development of third and fourth generation vaccines will probably progress. Vaccinia and its many artificial variants such as MVA and NYVAC are large stable DNA viruses, relatively safe and easy to work with (4,41,42). Given their safety in humans, they may be excellent vectors for other vaccine antigens. In animal models, MVA vaccines are immunogenic and protective against various infectious agents, including HIV, simian immunodeficiency virus, influenza, parainfluenza, measles, malaria, tuberculosis and several cancers (48). An NYVAC based vaccine against HIV shows promise (49).
MVA and other engineered fourth generation viruses such as NYVAC probably have more of a future as engineered vectors than as smallpox vaccines (48). Infections with other zoonotic poxviruses have recently been documented, and the newer smallpox vaccines may be needed if one or more of these viruses becomes epizootic or pandemic (51).

Summary and Conclusions
Given their serious side effects and outmoded production methods, first generation smallpox vaccines, despite their proven effectiveness, are no longer acceptable. Second generation vaccines, whose effectiveness can be assumed because they are made with the same vaccinia strains as first generation vaccines, have been created. One, ACAM2000, has shown non-inferiority to a first generation vaccine and has been added to the National Strategic Stockpile.
Third generation vaccines, which are derived from first generation vaccinia strains by serial passage in non-human tissues or by genetic modification of such strains in modern viral genetic laboratories, show promise as practical vaccines. MVA (Modified Vaccinia Ankara) has undergone several trials for safety in humans, including those who are HIV positive or have atopic dermatitis. It may be a good vaccine for use in persons who have contraindications to vaccination with first or second generation vaccines but who require vaccination. MVA has been added to the National Strategic Stockpile. The Japanese vaccine LC16m8 seems good for outbreak control because it can be lyophilized, administered with a bifurcated needle, and produces a visible major reaction on the skin that proves its "take". LC16m8 has not yet been submitted for licensure in the United States.
There are many fourth generation vaccine candidates, produced by modern immunologic and virologic techniques. These are subunits of vaccinia with several genes removed, or vaccines created de novo by adding various epitopes or other immunogens from vaccinia to artificially created molecules. The cost and difficulty in proving that such vaccines are effective against smallpox may inhibit their full development as smallpox vaccines. They may prove very good as "vector vaccines" for other infectious agents because immunogenic parts of such agents can be added to their genetic structure.

Introduction
From 4-15 April 2018, the Gold Coast (GC), Australia, hosted the 21st Commonwealth Games (the Games), the largest sporting event held in Australia this decade and the largest ever held on the GC (1,2). Mass gatherings, such as large multi-sport events, increase the complexity of public health surveillance and response efforts due to the increased potential for communicable disease transmission, outbreaks of gastroenteritis and influenza, and pressure on the existing health system (3-7). The Gold Coast Public Health Unit (GCPHU) was the local agency responsible for providing public health surveillance and outbreak response for the Games during the operating period, 20 March-18 April 2018. Field Epidemiology Trainees (FETs) from the Master of Philosophy in Applied Epidemiology (MAE) program, Australian National University, have previously provided surge capacity for mass gathering events, such as World Youth Day 2008 (8). To meet the increased demands of the Games, current FETs were deployed to assist the GCPHU in the enhanced surveillance of potential communicable disease threats. The MAE program, Australia's Field Epidemiology Training Program, is a two-year research degree that trains field epidemiologists in the detection of and response to public health problems. This perspective piece summarises our experience in the field.

Identified risks and surveillance
There was enhanced surveillance of diseases deemed to have a potential impact on the Games, including gastroenteritis and influenza. Norovirus infection was highlighted as a public health risk following outbreaks at recent high-profile international sporting events (5, 7). The surveillance system consisted of strategies which were developed for, existed prior to, or were enhanced for the Games. An Emergency Department Syndromic Surveillance System (EDSSS) and electronic questionnaire provided a mechanism to characterise hospital presentations (9,10). Daily counts of defined illnesses were received from sentinel health services including private Emergency Departments, general practices, pharmacies, telephone helplines, and partner agencies' reports. Enhanced international communicable disease surveillance activities provided daily analysis and reporting of communicable disease threats that could potentially affect health in Australia, particularly at the Games. Routine disease notifications were also a key information source.
A robust risk assessment process involving the GCPHU's epidemiology, communicable disease, and environmental health teams ensured that identified signals were followed up with further investigation and public health action. Although an increase in gastrointestinal reports was identified, no Games-related outbreaks were detected. The implementation of a sensitive and well-planned surveillance system and the coordination of timely public health action likely contributed to this success. Selected surveillance systems continue to operate in the GCPHU, leaving a post-Games surveillance legacy.
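Syndromic surveillance systems of this kind typically flag a signal when a day's count exceeds what the recent baseline predicts. As a hedged illustration (not the GCPHU's actual algorithm), a simple mean-plus-z-standard-deviations exceedance rule can be sketched as:

```python
import statistics

def exceedance_signal(history, today_count, z=2.0):
    """Flag today's syndromic count if it exceeds the historical
    mean by more than z sample standard deviations. This is a
    deliberately simple aberration-detection rule; operational
    systems add refinements such as day-of-week adjustment."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return today_count > mean + z * sd

# 14 hypothetical days of baseline gastroenteritis presentations,
# followed by a spike worth investigating
baseline = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 5, 4, 3, 4]
print(exceedance_signal(baseline, 12))  # True
print(exceedance_signal(baseline, 5))   # False
```

A flagged day is only a signal; as described above, it would still pass through the risk assessment process before triggering public health action.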

Operations
We played a key role in operational activities in surveillance and the Emergency Operating Centre (EOC) during the operating period. FETs were independently responsible for surveillance, analysis, reporting, and communications as part of a rotating roster within teams.

Surveillance and outbreak response
Pre-Games surveillance efforts included testing surveillance systems to improve readiness and identify gaps. During the Games, outbreak prevention activities included: monitoring EDSSS data, analysing electronic exposure questionnaires, undertaking investigations and contact tracing, assisting with Situation Reports (SitReps), and following up with sentinel sites. Daily surveillance involved conducting, recording, and analysing three-day food histories from gastroenteritis cases to identify clustering and determine common exposures. Monitoring surveillance data to identify potential signals and risks guided food and water venue site inspections and public health action.
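Determining common exposures from case food histories amounts to counting how many cases report each exposure. A toy sketch with invented venue names (not the actual Games data) could look like:

```python
from collections import Counter

def common_exposures(food_histories, min_cases=2):
    """Return exposures reported by at least `min_cases` cases,
    most frequent first. Each case's history is a set, so a case
    is counted at most once per exposure."""
    counts = Counter(
        exposure
        for history in food_histories
        for exposure in set(history)
    )
    return [(e, n) for e, n in counts.most_common() if n >= min_cases]

# Hypothetical three-day food histories from three gastro cases
cases = [
    {"venue A salad bar", "hotel breakfast"},
    {"venue A salad bar", "street food"},
    {"venue A salad bar", "hotel breakfast"},
]
print(common_exposures(cases))
# [('venue A salad bar', 3), ('hotel breakfast', 2)]
```

In practice a shared exposure like this would prompt an analytic study and an environmental health inspection, not action on counts alone.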

Emergency Operating Centre
The GCPHU EOC was the single point of contact during the operating period. Prior to the Games, processes such as a daily rhythm of meetings and reports and communications with key stakeholders were agreed upon by all involved. During the Games, the EOC coordinated horizontal and vertical information sharing with stakeholders, including the Organising Corporation, delivery partners and emergency services. The EOC also facilitated incident management and response between the various GCPHU teams. Key daily outputs included a centralised incident log, records of activities and decisions, a communications SitRep, and Ministerial briefings to ensure stakeholders were informed of public health threats and response efforts by the GCPHU.

Lessons learnt
Being part of a localised epidemiological unit provided us with hands-on experience in mass gathering surveillance. We were part of high-level operational meetings where we could directly observe emergency management structures, procedures and cooperation between response agencies. It highlighted to us that successful mass gathering surveillance requires thorough planning and preparation of systems, early collaboration with stakeholders, robust data management, multi-disciplinary teams, and strong communication and leadership.
A key strength of the enhanced surveillance system was the cohesive integration of diverse multidisciplinary teams, which enabled systems to function effectively and fulfil their goals. This was achieved through clearly designated responsibilities and reporting lines, the collective celebration of achievements, and a spirit of cooperation. Being part of an experienced and inclusive team environment facilitated our skill development in the collection, analysis and interpretation of surveillance data and allowed us to contribute to actionable disease intelligence, evidence-based decisions, and public health measures.
Future mass gathering surveillance and response systems need to be flexible to adapt to changing priorities and workloads. One recommendation would be the implementation of a communication system that regularly relays impromptu changes to internal processes, whether via a daily update email, a daily meeting agenda item, or a rolling, centralised standard operating procedure that records the agreed processes for that day. Processes could also be put in place to allow staff movement to share the workload in areas of most need and facilitate skill development.
We came from across Australia and from a wide network of health and research institutions to fulfil a temporary resource base required by the GCPHU, enabling additional Games surveillance and outbreak response to occur and routine public health investigations and surveillance to continue unaffected. FETs were able to merge quickly into a team environment, develop critical relationships, and undertake public health action. Our experience emphasises the importance and benefits of FET training and engagement for surge capacity in resource-limited and emergency settings, enhancing public health system readiness for high-impact future events. Working under the direct supervision and mentorship of experienced epidemiology, communicable disease, and environmental health staff provided us with a broader perspective of field epidemiology and practical skills to become Australia's future field epidemiologists and public health leaders.

(2) involving the SES as the control agency. There is no statutory authority in Australia to govern emergency responses; rather, each jurisdiction runs its own state emergency services in alignment with the national emergency plan (2). SES units are tasked with emergency responses at a local level but can, and do, join with other units and deploy interstate and overseas. All volunteer emergency workers in Australia receive nationally standardised training in first aid and general rescue.
In the AIIMS framework the SES' role, in addition to being the control agency for flood and storm events, is to support other emergency agencies with member participation, resources, and welfare. During the Black Saturday bushfires (Victoria, February 2009) the SES were tasked with providing and administering temporary field bases, communication infrastructure, meals and peer support to all agencies. During multi-agency responses the SES may also provide operational support in the form of building stabilisation, clearing of vegetation, traffic management and crowd control, and work very closely with the Red Cross in managing and co-ordinating evacuation centres and the provision of welfare. In the last year Victoria's SES have provided operational assistance to the agency in control, in the form of boat crews and staging area management, in several environmental incidents: a blue-green algae bloom in the Melbourne water catchment area (November 2018, with the local government council in control) and a large tip fire in east Melbourne (April 2018, with the Metropolitan Fire Service in control). In short, the SES provide a willing and able workforce that forms the backbone of a response to any given incident, anywhere in Australia.
Unfortunately, SES volunteers, due to the nature of their supportive background role, tend to disappear into that background and are often overlooked in emergency management planning. Individual states' emergency plans for pandemics (3-9) do not mention volunteer emergency services at all, except for a single line to the effect that the SES are to provide assistance "on request". The national plan for human epidemics does recommend that "planning should review the potential for use of community groups and/or volunteers who could provide assistance to health professionals and organisations in severe situations" (10). Without formal engagement between the health care sector and the volunteer emergency sector, it is hard to imagine that the health care sector is cognisant of the emergency response capabilities of the SES that could be called upon in a time of need.
Emergency management research has shown that a key enabler in emergency operations is task sharing or division of labour (11). During the Ebola health crisis in Sierra Leone (2015) it was demonstrated that providing healthcare professionals with logistics and operational support, such as supply chain assistance and building repair, allowed them to perform the task of providing healthcare more efficiently (11). The SES have enormous capacity and experience in providing infrastructure and logistical support of the kind that could be utilised by the Australian health care sector in an emergency. For example, a personal communication has revealed that in an influenza pandemic, when the main hospital capacity is overwhelmed, a quaternary centre in Melbourne will deploy a field hospital using manpower drawn from the hospital workforce. A better, more strategic use of resources could be to outsource the deployment and management of a field hospital to the SES. Studies have shown that hospitals largely overestimate their surge capacity (12) and are likely to be overwhelmed during an emergency. It is highly likely that smaller regional and rural centres will also very quickly become overwhelmed, even in a small epidemic, and would benefit from voluntary workforce assistance. In a fast-moving pandemic, voluntary workers could also be utilised to perform contact tracing and case finding where the Department of Health's workforce capacity is overwhelmed.
Volunteer emergency personnel have any number of capabilities that can assist with aspects of an emergency response, from resilience, to first response, to the recovery stage. The SES are frequently observed assisting other emergency agencies with tasks such as: holding a bag of fluid for a paramedic at a motor vehicle accident; providing lighting for a forensic team at the site of a deceased person; waiting with the deceased until the coroner arrives; carrying injured parties out of remote bushland; and door knocking in flood-prone areas to deliver emergency preparedness education. The list is virtually endless, and how these skills could be applied to assist in a health emergency is limited only by imagination. Utilising volunteer workforce capabilities should be considered earlier and more comprehensively in the planning process, rather than scrambling to arrange assistance mid-emergency. This will allow for proper volunteer training and risk management, in addition to ensuring a timely and co-ordinated incident response. An audit of SES capabilities as they could apply to existing health emergency response plans is urgently recommended.
Currently, plans that deal with biological threats to humans have the state's health agencies as the sole control agency, with limited planning for expanding to a multi-agency response when hospital centres and the Department of Health become overwhelmed. Recognising the role that volunteer emergency services can play in a multi-sectoral health emergency response, through member participation, resources and welfare, would strengthen Australia's response to future human health threats. It is recommended that state health departments partner with their state's volunteer emergency agencies on planning and training for human health threats, to ensure appropriate sharing of knowledge and resources. Other options for civil preparedness should also be considered for improving surge capacity during an emergency.

About the Author
Dr Megan Crane is a medical researcher and State Emergency Service (SES) volunteer in Victoria, Australia. She completed a PhD in Immunology at the Monash Institute of Medical Research and did postdoctoral research in HIV and HBV at the Alfred Hospital, Melbourne. She has been volunteering with the SES for two years and has attended many incidents in a first responder capacity.

This mass displacement event provided an ideal setting for large-scale outbreaks of communicable diseases. The refugee camps in Cox's Bazar, in one of the wettest parts of the world, endure monsoonal rains and are prone to landslides and flooding (Image 1). These conditions contribute to the proliferation of diseases spread via person-to-person, vector-borne, airborne, waterborne and zoonotic transmission. Complex emergencies such as this also pose the risk of environmental health threats, gender-based violence, decreased mental health and a rise in noncommunicable diseases (6).
The identification of the first case of diphtheria was rapidly followed by one of the world's largest and most protracted diphtheria outbreaks. As of mid-August 2018, the number of diphtheria cases exceeded 8,000, including 44 deaths. This outbreak marked the first major resurgence of diphtheria in Bangladesh in the post-universal-vaccine era, since the 1980s. By comparison, the world's largest known diphtheria epidemic was recorded in the 1990s throughout the former Soviet republics, with over 140,000 cases and 4,000 deaths, and a protracted outbreak in Indonesia (2011-2017) caused over 3,000 cases and 110 deaths (7,8). This diphtheria outbreak in Bangladesh was unique in that it occurred among a largely undervaccinated refugee population living in overcrowded camps, and spilled over to the local host Bangladeshi community, which generally had a high (>90%) immunisation rate (9).
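As a rough, back-of-the-envelope comparison of the three outbreaks described above, the crude case fatality ratios implied by the quoted totals can be computed directly. This is only a sketch using the approximate case and death counts given in the text; it ignores differences in surveillance sensitivity and case ascertainment between settings.

```python
# Crude case fatality ratio (CFR) = deaths / reported cases, expressed
# as a percentage, using the approximate totals quoted in the text.
outbreaks = {
    "Bangladesh, Cox's Bazar (2017-18)": (8000, 44),
    "Former Soviet republics (1990s)": (140000, 4000),
    "Indonesia (2011-17)": (3000, 110),
}

for name, (cases, deaths) in outbreaks.items():
    cfr = deaths / cases * 100
    print(f"{name}: crude CFR {cfr:.2f}% ({deaths}/{cases})")
```

On these rough figures, the Cox's Bazar outbreak had a crude CFR of about 0.55%, compared with roughly 2.9% in the former Soviet republics and 3.7% in Indonesia, though the ratios are indicative only.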
There were also other infectious disease outbreaks and health conditions that occurred concurrently with diphtheria. Table 1 provides a summary of key epidemic-prone diseases monitored using the Early Warning, Alert and Response System (EWARS), a web-based mobile application used for disease notification, outbreak detection and response in emergency situations (10).
The Australian Response MAE (ARM) network deployed three epidemiologists and three Master of Philosophy in Applied Epidemiology (MAE, Australia's Field Epidemiology Training Program, FETP) scholars to the response between January and April 2018 to support the WHO's emergency operations in Cox's Bazar (2,12). The overall objectives of the Global Outbreak Alert and Response Network (GOARN) deployees were to provide technical support for capacity building and training for improved prevention and control of outbreaks, while working in collaboration with other health organisations as well as the Bangladesh Ministry of Health and Family Welfare (MoHFW).
The WHO supported MoHFW in conducting activities to contain the diphtheria outbreak and prevent additional outbreaks. These activities included the provision of technical and logistical support for capacity building and training, strengthening disease surveillance, supplying essential medical supplies and laboratory materials, and adapting preparedness and response activities on disaster management to the local context. The GOARN team supported the WHO's mass immunisation campaigns for diphtheria, tetanus, measles, rubella, and cholera, and launched the Extended Program on Immunization through outreach clinics within the camps.
The main contributions of the Australian GOARN team included: 1) supporting the establishment and monitoring of the EWARS for diphtheria and diseases listed in Table 1. GOARN deployees conducted in-field risk assessments and supported health partners in the use of EWARS; 2) coordinating contact tracing within the camps and the host community; 3) coordinating the establishment of local laboratory capacity for diphtheria and selected epidemic-prone diseases; and 4) providing infection prevention and control assessment, training and support with a focus on anticipated potential health needs (13). Providing large-scale health support in crisis conditions predictably involved challenges due to lack of resources, poor infrastructure and limited capacity for patient follow-up in the refugee camps. Timely identification of patients and their contacts was a challenge due to poorly or unmarked dwellings leaving high likelihood of miscounting. The WHO-GOARN team established a strong network of partner agencies that acted as focal points for treatment, contact tracing and referrals for diphtheria and other outbreaks, and facilitated daily sharing of information between agencies.
The local host community has historically relied on basic health services that are considered substandard compared to urban Bangladesh (14,15). The host community had limited availability of public health laboratory services prior to this crisis; however, this situation has improved with the establishment of a new laboratory in Cox's Bazar through a collaboration between the WHO case management team and the Institute of Epidemiology, Disease Control and Research (IEDCR). There was a rapid expansion and strengthening of primary health services and, as of April 2018, there were more than 200 health facilities across the Rohingya camps (Image 2). There are also sensitive and complex political aspects of humanitarian assistance that must be navigated by all organisations involved.
As expected, language was an obstacle to effective health service delivery. Chittagonian, spoken in the Chittagong region and by local staff, is the primary dialect of the Cox's Bazar population, while other regions of Bangladesh primarily speak Bengali (Bangla), and the Rohingya speak the Rohingya language. The WHO office in Cox's Bazar operates in English, which necessitates a chain of communication from English to Bengali or Chittagonian, and then to Rohingya. The use of translators and local personnel aided communication, especially since Chittagonian is not dissimilar to Rohingya. Language barriers, coupled with cultural differences, made it difficult to communicate with affected populations and highlighted the importance of building capacity by identifying and training bilingual local staff.
Our participation in this emergency response demonstrated a strong partnership between several players: volunteers, front-line healthcare workers, epidemiologists, laboratory specialists, immunisation experts, operations and logistics managers, and high-level policy makers. For the Australian FETP trainees, the experience was a first glimpse of field epidemiology in a complex humanitarian emergency and acutely demonstrated the barriers and challenges to success in communicable disease prevention in a crisis. The impact and efficiency of the response in Cox's Bazar will improve over time with increased training of local staff, collaborative efforts of health organisations and the establishment of a greater number of health facilities, including vital services such as laboratories. While some may suggest that emergency response is like "blowing out the fire", a strong response based on underlying principles of capacity development and preparedness can lead to long-term gains in outbreak response capacity. This is particularly critical in a crisis that is expected to continue for many years.

Introduction
From 4-15 April 2018, the 21st Commonwealth Games (the Games) were held on the Gold Coast in Queensland, Australia. One third of the world's population was represented at the Games by 71 teams from members of the Commonwealth of Nations (1,2). In addition to 6,600 athletes and numerous support staff, more than 500,000 spectators attended the Games across several sites (3). The large influx of travellers and the temporospatial concentration of people at a mass gathering provide ideal conditions for the introduction of international communicable diseases and heightened local transmission (4). As a result, it was essential for local, state and federal public health agencies to be prepared to detect and respond to communicable disease events at the Games. An important facet of communicable disease control was having an awareness of disease activity in countries from which people were likely to be travelling. With the aim of providing information for risk assessments and informing preparedness and response activities, enhanced international communicable disease surveillance was instigated for the Games.

Report format and distribution
The reporting schedule was informed by surveillance processes from previous mass gathering events, such as the Sydney Olympic Games in 2000 and the Glasgow Commonwealth Games in 2014 (Gold Coast 2018 Commonwealth Games Corporation, unpublished data, 2018). Surveillance began in January 2018, with monthly reports. Weekly reports ran through March and daily reports began two weeks prior to the opening of the Games and continued until four days after the closing ceremony. The frequency of reporting was designed to mirror the anticipated travel volumes during the Games period.
Surveillance reports were provided to the heads of Communicable Disease Branches in each State and Territory, Queensland Health epidemiologists, general practitioners, pathologists and microbiologists throughout Queensland, the medical functional area of the Commonwealth Games organisers and relevant staff at the DoH.
Reports comprised a line list of events, including details of the country and region affected, the disease, susceptible populations, case numbers and a brief contextual description. All identified events continued to be monitored and were updated where necessary.

Outcomes
Across the four-month period, 27 disease events were identified and monitored. Notable events included the largest reported outbreak of Lassa fever in Nigeria, which was ongoing throughout the Games period. Of the threats highlighted in surveillance, only one isolated cluster of influenza materialised at the Games.

Discussion
Enhanced international communicable disease surveillance provided situational awareness for decision making and risk assessment at the local and national level during the Games. As a result of timely, sensitive and well-planned surveillance and response by the GCPHU, there were no Games-related outbreaks. International surveillance identified a variety of relevant disease events overseas and complemented surveillance of local disease activity during the Games. Fortunately, influenza was the only threat identified through surveillance that eventuated. Swift response activities quickly isolated and managed the cases.
Sensitivity, timeliness and data quality were all important facets of the surveillance. Use of a broad variety of data sources ensured that intelligence was reliable, thorough, gathered as quickly as possible, and that it was updated as situations became clearer through official sources.
Wide distribution of the surveillance report meant that all levels of government and a variety of relevant stakeholders shared the same information and an ongoing awareness of disease activity overseas. This provided the information necessary to prepare appropriately and contextualise presentations and perceived risks. Dissemination of the information allowed frontline health professionals to consider disease presentations that would not normally be included as differential diagnoses.
A standard operating procedure document was established during the project and serves as a legacy to inform future engagements of the DoH in providing international communicable disease surveillance for mass gathering events.

Conclusion
International communicable disease surveillance was an important facet of preparedness and situational awareness for the Games. Reports provided the necessary intelligence to inform risk assessments and enabled primary health care and public health systems to be sensitive to potential risks. International surveillance should be considered as a necessary complement to planning and local surveillance activities in future mass gatherings.

Introduction
Now more than ever, the immense growth in biological sciences and technology has led to a phenomenal increase in the creation and modification of infectious pathogens using genetic engineering (1). The research underpinning these scientific and technological advances is broadly referred to as Dual Use Research of Concern (DURC). DURC is research conducted with good intent but with the potential to cause harm. DURC includes research using technologies such as synthetic biology, clustered regularly interspaced short palindromic repeats (CRISPR) and gene editing (1-3). Despite the rapid expansion in DURC, relevant training on DURC and its associated risks has not generally been included in undergraduate and postgraduate medical training curricula (4). We outline the global challenge of DURC from the perspectives of two clinicians and a health policy analyst with training from a developing country (Nigeria) and a developed country (Australia). We then recommend the development of global guidelines and country-level national policies on the inclusion of DURC training in undergraduate and postgraduate medical curricula.

The global challenge of DURC
Research classified as DURC has enormous benefits with respect to understanding mutations that lead to transmissibility, antimicrobial resistance, prevention of genetic disorders, and drug and vaccine development. However, it is also associated with a significant risk of intentional or accidental release of novel infectious agents, with far-reaching global consequences (1)(2)(3)(4)(5). With rapid international travel and trade, a single released infectious pathogen could, within a short period, spark unnatural pandemics with profound negative effects (illnesses and deaths) on individuals both within the country of release and in distant countries (1,6). DURC thus has a potential for extensive catastrophic health, economic and social outcomes, which transcend local and national boundaries (1,5). Hence, there is the need for a highly coordinated, dynamic and robust program that adequately engages all important stakeholders (clinicians included) that are directly or indirectly affected by or involved with DURC, as well as those charged with the responsibility of biosecurity.

The perspective of clinicians on DURC training as first responders
Despite being frontline responders managing cases in outbreaks (natural or unnatural), few clinicians have training in DURC, its risks, or the identification of red flags which may suggest a DURC-associated outbreak. Examples of past outbreaks which were not recognised by treating clinicians or public health authorities are the Rajneeshee Salmonella attack and Operation Seaspray (7,8). In the former, Salmonella culture was added by members of the Rajneesh sect to salad bars in eight restaurants in Wasco County, Oregon, United States of America (USA) (7). This resulted in 751 cases of illness, including 45 hospital admissions. Similarly, Operation Seaspray was a secret experiment by the US Navy in which Serratia marcescens was sprayed over the San Francisco Bay area in California (8). This resulted in 11 hospitalised cases of urinary tract infection, including one death. Despite these cases having some unusual features, neither the managing clinicians nor public health experts considered those outbreaks to be DURC-related. Adequate training is crucial, not only to ensure optimal patient management but also to promptly alert public health experts, who are charged with the responsibility of conducting comprehensive outbreak investigations (9,10).
From our experience, undergraduate medical and residency training in Nigeria (a developing country) and Australia (a developed nation) conspicuously lacks training in DURC, its associated risks, and the peculiar challenges of recognising and managing cases in unnatural outbreaks. Indeed, a recent report from global experts in the field of biosafety revealed that post-doctoral, graduate and undergraduate training of professionals in the life sciences, including medicine, is devoid of discussions or courses about DURC, except in specific cases where a trainee is working directly with a select agent (4). Similarly, a recent survey in the United States of America among administrators and trainers in the life sciences revealed that about 59% of them had no knowledge of DURC, only 19% could define DURC, and about 22% were unsure of what DURC meant (11).
With the endorsement of open publishing of DURC-related research methods by the United States National Science Advisory Board for Biosecurity in 2012 (12), and the numerous 'do-it-yourself' laboratories constantly springing up in different parts of the world (4), DURC-related threats appear inevitable. Hence, all relevant stakeholders should be adequately prepared to tackle these threats effectively whenever they occur. Physicians, as first responders, are stakeholders with obvious knowledge gaps in recognising and tackling threats related to DURC and its outcomes.
A recent, highly publicised example of DURC highlights the current lack of adequate controls. A Chinese scientist (He Jiankui) purportedly altered the CCR5 gene to confer HIV resistance in a set of twin girls (13). This was reportedly achieved using CRISPR, a tool that scientists can use to excise, embed and change specific pieces of deoxyribonucleic acid (DNA). He Jiankui's 'experiment' may have unknown and unquantifiable risks attached to it. Dr Gaetan Burgio, a genetics researcher, describes these risks as "off-target effects" that could turn off genes that maintain good health, including those that suppress cancerous growths (13).

Policy recommendation
Responding to the accessibility and ease of DURC is a global challenge. In our opinion, the World Health Organization (WHO) should develop guidelines on how DURC can be integrated into the medical curricula of physicians in member countries, which each country should adapt to fit their setting (14).
Additionally, country-level policy development is critical because of the diversity and complexity associated with DURC in each country. The policy approach for each nation should be country-specific, with each country developing policies grounded in the WHO framework that can be implemented within its context (14). We also propose that a combination of top-down and bottom-up approaches, involving physicians, public health experts, the federal ministries of education and health, research and ethics committees, security and intelligence agencies, and other relevant stakeholders, would be most appropriate in the development of curricula and policies.
The learning outcomes and content of DURC related medical curricula should clearly respond to differences in knowledge needs of undergraduate and postgraduate trainees. While for undergraduates, a foundational course to prepare them for DURC related challenges is required (15), postgraduate courses should be more in-depth with focus on heightened index of suspicion with unusual clinical patterns, DURC related ethics, research methodology and emergency response, among others. In addition, the importance of interoperability and collaboration with all other first responders and important stakeholders should be incorporated into the post-graduate medical curricula (6,16).

Conclusion
DURC is a rising global challenge, yet there is a lack of educational curricula, knowledge and skills regarding DURC among clinicians, who are important first responders. Clinicians should thus be adequately equipped with the knowledge and skills to effectively manage DURC-related medical threats. Country-level policies on integrating DURC training into the curricula of undergraduate and postgraduate medical trainees are therefore of paramount importance. Such training should also be considered for undergraduate and postgraduate science curricula and research degrees.
Feedback from operational stakeholders who manage or respond to outbreaks is that they are often too busy to review literature or obtain relevant background information to assist them with acute response. Unlike a traditional analytical outbreak investigation report, Watching Briefs are intended as a rapid resource for public health or other first responders in the field on topical, serious or current outbreaks, and provide a digest of relevant information including key features of an outbreak, comparison with past outbreaks and a literature review. They can be completed by responders to an outbreak, or by anyone interested in or following an outbreak using public or open source data, including news reports.

Title
Report on the 2018 Acute Flaccid Myelitis (AFM) Outbreaks in the USA (3,4)

Clinical features
AFM is a clinical syndrome that principally includes a sudden onset of acute limb weakness or paralysis (12). Other features of AFM include difficulty breathing, ataxia (unsteady walking), wobbliness, headache, stiff neck, dizziness and jerking movements (12). Fever can also be expected in 50% of cases (13).
Confirmation of AFM requires clinical evidence of "acute flaccid limb weakness" and "evidence of spinal cord lesions" within the grey matter, as shown by magnetic resonance imaging (MRI) (14). Evidence of an increased white blood cell count within the cerebrospinal fluid (CSF) may also be supportive, but not confirmatory, of AFM (15).

Mode of transmission (dominant mode and other documented modes)
AFM as a clinical syndrome may be caused by different pathogens and cannot itself be transmitted from person to person. Infectious causes of AFM include poliovirus, the non-polio enteroviruses EV-A71 and EV-D68 (12), some flaviviruses, adenovirus type 21, and some herpesviruses, including cytomegalovirus and Epstein-Barr virus (12,16,17). Non-infectious causes may include environmental toxins or genetic disorders (17).
Transmission of EV-A71 is by the faecal-oral route; however, the virus is also present in, and can transmit via, respiratory secretions such as nasal mucus (18). EV-D68 has only recently been associated with AFM (19,20); it is primarily transmitted via the respiratory route, and respiratory illness is the main clinical syndrome of EV-D68 infection (21,22). EV-D68 has been detected in the faeces of some infected cases, but faecal-oral transmission has not been established as it has for EV-A71 (23). Both enteroviruses can survive on surfaces as fomites (24,25).
Other known causes of AFM include West Nile, Saint Louis encephalitis, and Japanese encephalitis viruses, which are all mosquito-borne flaviviruses and are primarily carried by Culex spp. mosquitoes (26).
Adenovirus transmission is similar to EV-D68, primarily via respiratory secretions. However, unlike EV-D68, transmission via the faecal-oral route has been shown (27).

Demographics of cases
The exact demographics of all cases in this outbreak are not publicly known. The CDC has reported that among cases confirmed prior to November 2nd, 59% (47/80) were male (1). The median age of cases is 4 years (1). This is consistent with the consensus among reporting media indicating that 90% of affected cases are aged under 18 years (28). In the August-November epidemic wave, all cases in Washington were aged under six years (5), all cases in Minnesota were under 10 (8), and all cases in Illinois were under the age of 18 (6). In Colorado, Pennsylvania and Ohio, all cases were described as "children" (7,11,29). There are no details on the cases in New Jersey yet.

Case fatality rate
There has been one suspected death from AFM diagnosed in May 2018, however the CDC has yet to confirm this (30). In the current epidemic wave, starting in August, no fatalities have been confirmed or reported.

Complications
Patients with mild AFM can expect to recover fully; however, at least half of cases may have residual limb weakness requiring ongoing physical therapy (31). Severe cases of AFM may experience respiratory failure and require ventilator support (12). Rare neurological complications associated with AFM can sometimes lead to death (12). Long-term complications include paralysis and residual limb deficits.

Available prevention
The causes of AFM are varied and include both infectious and non-infectious aetiologies. As such, the CDC recommends a variety of prevention strategies, including staying up to date with vaccines, basic hygiene (i.e. hand washing), and protection from mosquito bites (4). A vaccine is available to prevent Japanese encephalitis virus; however, it is only recommended for travellers visiting endemic countries for longer than one month, and it is not available for infants under two months of age (34).

Available treatment
There is no specific treatment for AFM; management is limited to supportive care. This includes the use of assisted ventilation when necessary (12). Specialists may assist with physical therapy whilst patients are hospitalised, to reduce muscle weakness or loss (12). The antiviral fluoxetine has been trialled as a treatment for presumptive EV-D68-associated AFM; however, it was not associated with improved outcomes (35).

Comparison with past outbreaks
In recent times, large outbreaks of AFM were first documented in the US in 2014 (3). Reporting of AFM, however, remains voluntary in the US, so the true incidence is unknown. The clinical similarity of AFM to other neurologic syndromes, e.g. Guillain-Barré syndrome and idiopathic transverse myelitis, increases the chance of misdiagnosis, potentially concealing the true AFM incidence further (37). Interestingly, the onset of the 2018 outbreak coincides with the time of onset observed in both previous outbreaks, in 2014 and 2016: August to October (4). This suggests that a two-year cyclical dynamic and a seasonal pattern of AFM outbreaks are emerging, meaning another outbreak might be expected in August 2020. This cyclical dynamic also supports a viral agent as the cause of the 2018 outbreak, rather than an unknown sporadic or environmental cause such as a toxin. Supporting this conclusion, the CDC released a report on 16 November 2018, based on the results of early diagnostic testing, stating that "clinical, laboratory, and epidemiologic evidence to date suggest a viral association" (1).
Among 125 confirmed AFM cases tested, 50 had a virus detected: 21 (42%) tested positive for EV-A71, 16 (32%) for EV-D68, and 13 (26%) for other virus types, including rhinovirus A subtypes and coxsackie A viruses (1). Detection in the cerebrospinal fluid (CSF) is confirmatory of the pathogen being the cause of AFM; however, only two of 21 CSF specimens tested positive: one for EV-A71 and another for EV-D68. While these numbers are limited, it is rare for EV-D68 and EV-A71 to be detected in the CSF, even among cases with confirmed AFM and positive respiratory or stool samples. For example, in the 2014 outbreak, of the 120 confirmed AFM cases, only 12 had confirmed EV-D68 infections: 11 from respiratory samples and one from CSF (19,38). This is likely due to the limited replication and persistence of EV virions in the CSF (20,39,40) and the delay between symptom onset and specimen sampling observed in the 2014 outbreak (38). Therefore, while CSF-positive detections are confirmatory, negative CSF results do not weaken other clinical and diagnostic evidence for EV-A71 and EV-D68 aetiology.
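The percentages the CDC quotes (42%, 32%, 26%) resolve against the 50 virus-positive cases (21 + 16 + 13) rather than against all 125 cases tested. A quick arithmetic check, using only the counts given in the text, makes the denominators explicit:

```python
# Virus detections among the 125 confirmed AFM cases tested (CDC figures
# quoted in the text). The quoted percentages (42%, 32%, 26%) are shares
# of the 50 virus-positive cases, not of all 125 cases tested.
tested = 125
detections = {"EV-A71": 21, "EV-D68": 16, "other viruses": 13}

total_positive = sum(detections.values())  # 50

for virus, n in detections.items():
    share = n / total_positive * 100
    print(f"{virus}: {n} cases, {share:.0f}% of virus-positive cases")

print(f"Any virus detected: {total_positive}/{tested} "
      f"({total_positive / tested * 100:.0f}% of cases tested)")
```

Keeping the two denominators distinct matters when comparing positivity across outbreak years, since overall detection (50/125, 40%) is much lower than the within-positive shares suggest.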
Due to the emergence of EV-D68 as a likely cause of the 2014/16 AFM outbreaks in the US, it is possible that EV-D68 is also the cause of the 2018 outbreak. The clinical and diagnostic evidence provided by the CDC, above, supports this conclusion. Perhaps unusual in this outbreak is the simultaneous detection of EV-A71 among many AFM cases; EV-A71 was not a known feature of previous AFM outbreaks in the US. In 2014, non-EV-D68 enterovirus species were detected in the stool of 11 AFM cases, which would be consistent with EV-A71 infection as a faecal-oral pathogen, but the species were not specified (38). In 2016, a single AFM case in Washington State tested positive for EV-A71 among 10 confirmed AFM cases (37). Similarly, there is little evidence internationally for co-circulation of EV-D68 and EV-A71 among AFM cases. For example, in France, enterovirus-associated AFM appears mostly associated with EV-A71, although EV-D68 has previously been detected in a single case (41). Of the 38 states affected, Texas has recorded the most AFM cases (n=25), followed by Colorado (n=16). Based on public data and reporting, we cannot determine whether the majority of confirmed cases in Texas had onset dates during the epidemic period of August to October, or prior. In September, a Morbidity and Mortality Weekly Report (MMWR) was published describing an increase in EV-A71-associated encephalitis, meningitis and AFM in Colorado between May 10 and June 5, 2018 (42). During this period, 34 young children (<2 years) presented with neurological symptoms and tested positive for EV-A71; three were subsequently classified as AFM (42). In the current epidemic, which began in August, 11 of the 16 AFM cases in Colorado have tested positive for EV-A71 (7). Only one case has tested positive for EV-D68, while two have tested negative for enterovirus species (7). This might suggest that EV-A71 is the predominant cause of the current AFM outbreak in Colorado, while EV-D68 is only a minor cause there.
In 2014, as in 2018, Colorado was one of the first states to report an increase in AFM detections (43). This suggests Colorado may be a significant state for the emergence of AFM epidemics in the US. It is unclear whether other states also experience simultaneous emergence of AFM, or whether a single state such as Colorado may act as a source, potentially disseminating the disease across the US. If the emergence is simultaneous, the environment and seasonality of non-polio enteroviruses may provide one explanation. Non-polio enteroviruses are known to circulate all year round; however, peaks typically occur in summer, between July and September in the Northern Hemisphere (44). Outbreak timing and transmission intensity have been shown to be associated with latitude and dew-point temperature, such that northern states observed peaks later in the season, whereas southern states observed a regular distribution of cases throughout the year. This supports the case for non-polio enteroviruses as the cause of the current AFM outbreak, as most states affected since the August epidemic wave are located in the northern US: Washington, Illinois, Colorado, Minnesota, New Jersey, Pennsylvania and Ohio (5-8). Clusters of EV-D68 have also been reported in New York; however, none of these cases have exhibited symptoms associated with AFM (45). Other evidence suggests non-polio enterovirus seasonality may be driven by waning seroprevalence in the population and the emergence of immunologically naïve newborns (46).
The detection of 13 non-EV-D68 enterovirus species among AFM cases (1) means other known viral aetiologies might also be implicated, including rhinovirus A subtypes and Coxsackie A viruses; however, these do not have a history of neurological invasion and AFM. Likewise, symptomatic evidence can exclude flaviviruses previously associated with AFM, such as West Nile virus, which typically presents with a skin rash, as the cause of this AFM outbreak. Furthermore, diagnostic testing for flaviviruses requires blood samples, which, if taken, have not been publicly reported. Adenovirus type 21 (AD21) has also been associated with AFM in the past (16). A large adenovirus outbreak sickened 36 people in New Jersey between September and November, which has temporal and spatial associations with the current AFM outbreak, also in New Jersey (9,47). Eleven deaths have been attributed to this adenovirus outbreak; however, no cases have developed AFM, and the dominant clinical syndrome was reported to be respiratory illness (48). Typing has also shown a mix of adenovirus types 3 and 7 predominating in the New Jersey outbreak, which do not have a history of causing AFM (48). This might practically eliminate AD21 as a potential cause of the current AFM outbreak.

Key questions
What is the cause of AFM in the US in 2018?
Why is AFM increasing in the US?

There were 3 cases diagnosed in the UK between 8 and 26 September 2018. All known close contacts were followed up after their last contact with the cases; no further cases were identified and no onward transmission of the virus was detected (3).

Clinical features
The clinical symptoms are similar to those of smallpox but less severe. In this outbreak, most of the suspected cases were reported to have a rash (4).
Monkeypox is a self-limiting disease with symptoms lasting from two to three weeks. Severity is associated with the infectious dose of exposure and patient health status and is typically worse among children. The incubation period is usually 6 to 16 days.
The illness can be divided into two periods (4-6):
1. The invasion period: fever, headache, swelling of the lymph nodes (a distinctive feature), back pain, myalgia and fatigue.
2. The skin eruption period (within 1-3 days after the appearance of fever): various stages of rash, spreading from the face to the palms and the soles of the feet, which are the most affected areas. The rash evolves from maculopapules to vesicles to pustules, with crusts forming within 10 days; resolution of the crusts may take three weeks.

Mode of transmission (dominant mode and other documented modes)
Zoonotic transmission: transmission may occur from close contact with infected rodents or primates through bites and scratches, or through consumption of infected and inadequately cooked animal products. Infection by inoculation can occur through contact with cutaneous or mucosal lesions on animals, especially when there are breaches in the skin barrier.
Person to person transmission: Transmission occurs through direct contact with blood, bodily fluids, fluids from cutaneous or mucosal lesions of infected persons or via the respiratory route through infected respiratory tract secretions. Vertical transmission is also described. Congenital monkeypox can occur as the virus is transmitted across the placenta. Similar to smallpox, there is no evidence of pre-symptomatic transmission; transmission occurs while symptomatic during the rash stage. Observational studies in the mid-1980s showed the main infectious period to be during the first week of the rash, similar to smallpox (7).
A study conducted in the US in 2003 examining health care worker exposure to patients with confirmed monkeypox showed that human to human transmission in an outbreak setting is rare but possible (8).
In the 2003 multistate outbreak in the US, two cases were reported to have close contact with lesions and ocular drainage of infected people, however they also had contact with the infected animals (9, 10). Person-to-person transmission could not be excluded in this instance (11,12).
In the 2017 Nigerian outbreak, three family clusters were found, which suggests some level of human-to-human transmission; however, most patients had no obvious epidemiological linkage or human-to-human transmission (13). The zoonotic source of the outbreak is also unclear (13). Human-to-human transmission occurs occasionally from primary cases but very rarely from secondary cases. About 7% of the 149 contacts investigated in the 2017 Nigerian outbreak were attributed to human-to-human transmission (14), while evidence of human-to-human transmission among sporadic cases in Nigeria in 2018 has been acknowledged but not specifically reported (15).

Demographics of cases
The index case was a Nigerian resident living at a naval base in Cornwall, UK (1, 2). The patient was suspected to have been infected in Nigeria before travelling to the UK.
No epidemiological link between the first two cases in the UK has been found (16).

Case fatality rate
From previous outbreaks, the CFR has been between 1% and 10% (18). Two genetic clades of monkeypox virus, the West African clade and the Congo Basin clade, have been defined in the literature. According to available data, the Congo Basin clade is more common than the West African clade and is endemic to the DRC (19). The West African clade is associated with milder disease and fewer deaths and has a CFR of <1%, while the Congo Basin clade has a CFR of up to 11% and previously documented human-to-human transmission (20).
In September-December 2017, the West African clade was identified in the Nigerian outbreak and, based on NCDC data, had a CFR of 2.9% with 68 confirmed cases from 197 suspected cases across 22 states (14). In 2018, based on NCDC data, the CFR was 2.2% with 45 confirmed cases from 114 suspected cases across 13 states. The same West African clade was reported (15).
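The reported CFRs can be reproduced from the confirmed case counts. As a rough check, a minimal sketch in Python; note the death counts used here are not stated above and are inferred (two deaths in 2017 and one in 2018 would be consistent with the reported percentages):

```python
def case_fatality_rate(deaths: int, confirmed_cases: int) -> float:
    """Crude CFR as a percentage of laboratory-confirmed cases."""
    if confirmed_cases <= 0:
        raise ValueError("confirmed case count must be positive")
    return 100.0 * deaths / confirmed_cases

# Inferred death counts consistent with the NCDC-reported CFRs:
print(round(case_fatality_rate(2, 68), 1))  # 2017 outbreak: 2.9 (%)
print(round(case_fatality_rate(1, 45), 1))  # 2018 outbreak: 2.2 (%)
```

A CFR computed against confirmed cases only will differ from one computed against all suspected cases, so the denominator should always be stated.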

Complications
Complications include permanent scarring, disfigurement and death. The prognosis may be worse for patients who are younger, have other comorbidities such as malnutrition, or those who are immunocompromised.

Available prevention
There is no vaccine for monkeypox. Vaccination against smallpox confers cross-protection for monkeypox and has been shown to be 85% effective against monkeypox (21). The vaccine is available as part of the national stockpile in the UK and the government was reported to have ordered more. The vaccine is currently not publicly administered or available in Nigeria (22).

Available treatment
Treatment is supportive and based on the patient's clinical condition. Symptomatic relief is also provided and vaccine-immune globulin can also be used (24).
Tecovirimat is the only antiviral treatment approved by the US Food and Drug Administration and may be a treatment option for monkeypox infection (25,26).

Comparison with past outbreaks
Two previous outbreaks were reported in Nigeria, in 1971 and 1978, with two cases and one case respectively, amongst individuals who were not vaccinated against smallpox. Cases were linked to consumption of meat obtained from tropical rainforests (28). The 1971 outbreak involved a four-year-old female index case; the secondary case was her 24-year-old mother. The single case identified in 1978 was a 35-year-old man (28).
Since then, monkeypox has remained a disease of Central and West African countries, except in 2003, when 37 confirmed and 10 probable cases were reported across six states in the US, the first reported outbreak outside Africa. Those affected had close contact with pet prairie dogs (rodents of the genus Cynomys) imported from the endemic region (29). Prior to 2017, the largest outbreak ever reported in Africa was in 1996 in the Democratic Republic of Congo, with more than 70 cases and lasting for one year (30). This was associated with close contact with squirrels and some person-to-person transmission. The ongoing Nigerian outbreak in 2018 has significantly more cases than previous outbreaks when probable and confirmed cases are included.
Between the start of the outbreak in September 2017 and … (2). Preliminary genetic sequencing suggests multiple sources of introduction and no epidemiological linkages across states (14,15).

Unusual features
• The UK outbreak is only the second outbreak outside of West and Central Africa. The first was the multistate outbreak in the United States of America, which was due to contact with imported animals.
• The source of infection for the first and second cases is unknown.
• The two cases originating from Nigeria were apparently unrelated.
• Importation of two cases to the UK from Nigeria through travel within days of each other, yet unrelated.
• From January to September 2018, there were only 76 confirmed and 114 suspected cases of monkeypox in Nigeria, September being the time of the UK outbreak. This makes the probability of two unrelated cases occurring in the UK in people arriving from Nigeria (and presumably exposed in Nigeria) fairly low.

Critical analysis
This outbreak started with imported cases in the UK, most likely exposed in Nigeria. There is no epidemiological linkage amongst cases across states in Nigeria, indicating separate sources for the two cases. Overall, the likelihood of monkeypox spreading in Europe through the UK remains low, because human-to-human transmission is not the predominant mode of transmission, but further travel-related cases cannot be excluded, as sporadic cases have been reported in Nigeria and other parts of Central and West Africa since the re-emergence of the disease in September 2017 (2). There is a need for a trained workforce in both countries to be aware of the symptoms, and especially for physicians to be aware of the similarities with and differences from varicella and orthopoxvirus infections. The second case was not initially diagnosed, despite there being a prior case of monkeypox diagnosed in the UK, highlighting failures of triage and diagnosis in the clinical setting, in this case resulting in nosocomial infection of a nurse. Heightened surveillance at airports may ensure early detection and efficient control of spread (31). The risk of new cases of monkeypox appearing in the UK depends on the extent of circulation of the virus in Nigeria and in other countries of West and Central Africa (32). Given the current size of the epidemic, this risk is low. It would also be advisable for health care professionals and public health officials to be aware of the outbreak situation in the West and Central African region, especially Nigeria.
The risk of person-to-person transmission depends on the nature and duration of contact. Generally, monkeypox has low person-to-person transmission potential, with six being the highest number of suspected consecutive transmission events recorded, and varied incubation periods depending on differences in the nature and duration of exposure (33). The household attack rate in previous studies ranged from 3% to 11%, was higher among unvaccinated persons, and up to six intrafamily transmission events have been documented (21,34,35). The median household attack rate was as high as 50% in one study, which could have been due to underestimation of the total number of human monkeypox cases and bias introduced because interviews were conducted by household (36). No close contacts of the three UK cases were reported to have monkeypox in the current outbreak, but the third case was a nurse who became infected while caring for a patient, the second imported case (17). Once a high consequence infectious disease (HCID) such as monkeypox has been confirmed by appropriate laboratory testing, cases in the UK are transferred to a designated HCID treatment centre as soon as possible, and highly probable cases are also moved to the treatment centres (23). Once diagnosis was confirmed, the cases in this UK outbreak were transferred to and managed by hospitals designated to provide support and by one of the two principal HCID treatment centres, respectively (23). Effective communication, accurate risk assessment and implementation of response activities mitigated the outbreak and suggest that current contingency planning efforts are appropriate (17,23). However, the cases are too few to allow any firm comment about the robustness of the system. Cases could have been missed, and the strain may have been introduced into the UK animal population, although this is unlikely. Most of the available data on monkeypox come from individual case or outbreak reports and passive ad-hoc surveillance.
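The household (secondary) attack rates quoted above are simple proportions of exposed contacts who become cases. A minimal sketch, using illustrative contact numbers that are not drawn from the cited studies:

```python
def secondary_attack_rate(secondary_cases: int, exposed_contacts: int) -> float:
    """Proportion of exposed susceptible contacts who became cases."""
    if exposed_contacts <= 0:
        raise ValueError("exposed contact count must be positive")
    return secondary_cases / exposed_contacts

# Illustrative household: 2 secondary cases among 28 exposed contacts
rate = secondary_attack_rate(2, 28)
print(f"{rate:.1%}")  # 7.1%, within the 3-11% range reported in previous studies
```

The same proportion underlies the 50% median figure; differences in how "exposed contact" is defined largely explain the spread between studies.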
It is uncertain how well this reflects the actual epidemiology of monkeypox. Data obtained from published studies (32,37) indicate that the number of monkeypox cases has been increasing over the last two decades in more countries beyond the endemic areas of the Democratic Republic of Congo and West Africa (Figure 1). A possible explanation for the rise in monkeypox cases could be waning immunity following the cessation of smallpox vaccination in the 1980s. The highest number of cases in the Nigerian September 2017 outbreak was seen in the 21-30 age group, those born between 1987 and 1996, coinciding with being born after smallpox vaccine cessation (38). As most of the current population has not been previously vaccinated against smallpox, and more people have contraindications to the vaccine, there is possibly a large population susceptible to monkeypox virus infection in both Nigeria and the UK. The cross-protective effect of smallpox vaccine is being further explored through ongoing studies, which also show the effectiveness of third- and fourth-generation smallpox vaccines against monkeypox (39,40). The smallpox vaccine has been shown to be 85% effective against monkeypox (21), but issues with vaccine safety and the relatively small number of cases have not justified vaccination. Further, monkeypox is occurring in areas which struggle to maintain adequate coverage for routine vaccinations against measles and polio, which remain a higher priority (41). The global increase in monkeypox should be a cause for concern.
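The 85% effectiveness figure is conventionally derived from attack rates among vaccinated versus unvaccinated contacts, VE = 1 − (AR_vaccinated / AR_unvaccinated). A sketch with illustrative attack rates chosen only to reproduce that magnitude; they are not the data from (21):

```python
def vaccine_effectiveness(ar_vaccinated: float, ar_unvaccinated: float) -> float:
    """VE = 1 - (attack rate in vaccinated / attack rate in unvaccinated)."""
    if ar_unvaccinated <= 0:
        raise ValueError("unvaccinated attack rate must be positive")
    return 1.0 - ar_vaccinated / ar_unvaccinated

# Illustrative attack rates per 100 contacts (hypothetical values):
ve = vaccine_effectiveness(1.1, 7.2)
print(f"{ve:.0%}")  # roughly 85%
```

Because VE is a ratio of attack rates, it is unaffected by the absolute size of the contact cohorts, only by the relative risk between the two groups.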

Disease or outbreak
Listeriosis, which is caused by the bacterium Listeria monocytogenes serotype 4b, sequence type ST240 (1).

Origin (country, city, region)
A national outbreak of listeriosis is linked to the consumption of rockmelon (cantaloupe) originating from New South Wales and affecting other states of Australia, including Victoria, Queensland and Tasmania (2).

Suspected Source (specify food source, zoonotic or human origin or other)
Consumption of rockmelons (cantaloupe) from a single grower in New South Wales, the most populous state of Australia (3). The grower, Rombola Family Farms, is based at Nericon, near Griffith in regional New South Wales (4).

Date of outbreak beginning
Between 17 January and 9 February 2018, 10 elderly people fell sick and were diagnosed with listeriosis (4). The Australian National Focal Point (NFP) notified World Health Organization (WHO) of the listeriosis outbreak on 2 March 2018 (3).

Date outbreak declared over
Ongoing cases were documented until 27 July; no cases were reported after that. However, no official announcement has been made regarding the end of the outbreak.

Affected countries & regions
• Australia: New South Wales, Victoria, Queensland, Tasmania (5)
• Singapore: 2 cases reported to be genetically linked to the Australian outbreak strain (5). The farm identified as the source of the Listeria outbreak is a major supplier of rockmelons in Australia and exports to at least nine countries, including Singapore (6).

Number of cases (specify at what date if ongoing)
As of 27 July 2018, there had been 22 confirmed cases, comprising 6 cases in NSW, 8 in VIC, 7 in QLD and 1 in TAS (7).

Clinical features
Listeriosis is a life-threatening infection caused by consuming food contaminated with the bacterium L. monocytogenes (8). This disease primarily affects pregnant women and their newborns, older adults, and persons with weakened immune systems (8). The incubation period usually varies from 3 days to 70 days (8).
General symptoms (8):
• Fever
• Muscle aches
• Sometimes diarrhoea and other gastrointestinal symptoms
These symptoms may vary between cases. In severe cases, or cases of invasive listeriosis, patients might develop septicaemia and/or meningitis (9).
In Pregnant women: Infection might cause miscarriage, stillbirth or premature delivery (8).
Other risk groups (Older people and immunocompromised individuals): Additionally, these people might suffer from headache, stiff neck, confusion, loss of balance, and convulsions (8).

Mode of transmission (dominant mode and other documented modes)
• Listeria is mostly contracted through eating contaminated food containing L. monocytogenes bacteria. Babies can be born with listeriosis if their mothers eat contaminated food during the pregnancy (9).
• Listeria does not spread from person to person. However, it is commonly found in the environment (soil) and some foods such as raw meat, unpasteurized milk, soft cheeses, deli meats, raw fruits and vegetables (10).

Demographics of cases
Case demographics as of 27 July 2018 (7).

Complications
• In some cases, listeriosis can spread outside the intestines and cause a more advanced form of disease, called invasive listeriosis (11).
• Complications include bacterial meningitis, endocarditis, and septicaemia (11).
• In pregnant women, it might affect the unborn baby (foetus) and can lead to miscarriage or stillbirth (11).

Available prevention
For the general population:
• Avoid potentially contaminated food, especially for individuals in high-risk groups (12).
• Thorough washing of raw fruits and vegetables is recommended (13).
• Raw food from animal sources such as poultry, beef and pork should be cooked properly (13).
• Knives, cutting boards and hands should be washed promptly after handling uncooked foods (13).
• Consumption of unpasteurised milk should be avoided (13).

Additional precautionary measures for individuals at high risk
• Pregnant women and immunocompromised people should avoid cold deli meats, soft cheeses, pre-packaged salads and chilled raw sea food (13).
• They are advised to eat properly cooked and pasteurised dairy products (13).
There is no evidence of acquired immunity and there is no vaccine to prevent listeriosis (12).
There are various preventive strategies followed by the Food Standards Australia New Zealand (FSANZ) to develop national standards for food processing controls and close monitoring is carried out (12).

Available treatment
• Treatment involves antibiotics and supportive therapy (9). Physicians prescribe antibiotic treatment as per the Australian Therapeutic Guidelines - Antibiotic (12).
• The standard antibiotic therapy lasts 14-21 days; the medicines used for Listeria infection are as follows (14):
Mild infection
- Oral amoxicillin/ampicillin (2-3 g/day)
Severe infection
- Intravenous amoxicillin/ampicillin (4-6 g/day)
- Intravenous gentamicin for 14 days
- If the patient is allergic to ampicillin, trimethoprim 160 mg with sulfamethoxazole 800 mg, oral or intravenous depending on the severity of the condition (not to be used in the first trimester of pregnancy), is the generally recommended alternative.

The treatment and management protocol is almost the same for pregnant women and others at elevated risk of invasive listeriosis, except that foetal surveillance is carried out for pregnant women (14).

Comparison with past outbreaks
This outbreak is compared below with similar outbreaks of listeriosis in Australia in 2003, 2009, 2010 and 2014. A comparison is also made with the 2017 outbreak of listeriosis in South Africa.
Exposure to L. monocytogenes in contaminated foods is common in Australia; however, invasive listeriosis is an uncommon disease. For the years 2011-2015 in Australia, the five-year mean was 78 cases per year, with a notification rate of 0.3 per 100,000 population (12).
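The notification rate of 0.3 per 100,000 follows from the annual case count and the national population. A sketch, assuming a mid-period Australian population of about 23.5 million (an approximation; the exact denominator is not given above):

```python
def notification_rate_per_100k(cases: int, population: int) -> float:
    """Annual notification rate per 100,000 population."""
    if population <= 0:
        raise ValueError("population must be positive")
    return cases / population * 100_000

# 78 cases/year against an assumed population of ~23.5 million
print(round(notification_rate_per_100k(78, 23_500_000), 1))  # 0.3
```

Expressing counts as rates per 100,000 is what allows comparison with jurisdictions of very different population size, such as South Africa below.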

2018
• 22 cases and 7 deaths notified to the NNDSS (7)
• All L. monocytogenes positives were further identified as the outbreak WGS strain belonging to MLST240 (7).

Critical analysis
During recent years in Australia, the incidence of Listeria infection has remained constant or slightly declined, owing to the collective efforts of the food industry and the government through the implementation of standard food safety and hygiene protocols and improvement of the integrity of the cold chain (12). However, some shortcomings have been identified through investigations.

Investigations
The rockmelons from the Rombola Family Farms (RFF) were sampled by the NSW DPI (NSW Department of Primary Industries) and the whole melons as well as a composite sponge swab tested positive for L. monocytogenes (7). There were also some other peripheral issues noticed in the packing unit of the farms, which included dirty fans (used to reduce the moisture content of melons after washing) and some unclean spongy materials used for packing (7).
Investigations carried out in the rockmelon outbreak indicate that adverse weather conditions (heavy rainfall in December prior to harvest, followed by dust storms) are likely to have significantly increased the organic load and amount of L. monocytogenes present on rockmelons prior to harvest (7).

Implications
Food borne outbreaks due to L. monocytogenes that result in product recalls pose an economic burden for a country from the individual and societal perspective (21). Moreover, additional costs are incurred for investigations, ongoing prevention and control activities (21).

Comparative analysis of Australia vs South Africa outbreak
A timely epidemiological and environmental investigation was conducted which resulted in early detection of the outbreak source and early recall, which limited the number of cases in Australia (3). Despite the high number of cases in South Africa, there has been a tremendous delay in the actions taken by the health authorities and the government to trace the source of L. monocytogenes outbreak (22,23). On the other hand, Australia's response in rapid identification of the food source, prompt exchange of detailed export information, and genetic sequences through the INFOSAN network helped reduce the public health and trade impact of the outbreak (24).

An Unfortunate Need to Revisit Smallpox Preparedness
Michael T. Osterholm 1

Center for Infectious Disease Research and Policy, University of Minnesota, Minneapolis, USA
Often hailed as the single greatest accomplishment in modern health practice, the eradication of smallpox in 1980 ended its historic scourge of death and severe disease (1). During the 39 years since smallpox eradication, for most of the world its historic human devastation has become but a distant memory. A concerted effort by the World Health Organization ensured that variola major, the virus responsible for smallpox, no longer exists in clinical research laboratories around the world. Today we believe that the virus is held in only two high-security biosafety level 4 repository laboratories, in the United States and Russia.
The likelihood that the virus remains in yet unidentified laboratories throughout the world is, at best, a theoretical possibility. For this reason, despite a call by some to maintain preparedness for the possible return of smallpox, such preparedness remains a low priority for almost all countries in the world.
More recently, however, concern about the potential release of smallpox into the public grew when Canadian scientists synthesized a closely related orthopoxvirus, horsepox, using technology and resources that would be available to a number of laboratories throughout the world. This work remains controversial and raises serious ethical and scientific questions about whether it should even be done. Regardless, it is a stark reminder that someone with a similar capability as the Canadian scientists could synthesize variola major and, with the release of that virus into the public, begin the catastrophic return of smallpox. The world's population is now immunologically naïve to smallpox owing to the end of smallpox vaccination programs more than 40 years ago and waning vaccine immunity in older adults.
We must recognize the challenge that this situation presents to society. Most bioterrorism preparedness experts agree that the likelihood that smallpox could return and cause a devastating regional or even global outbreak remains very low. Nonetheless, because of the potential consequences of such an outbreak, I believe that society has no choice but to have at least a moderate level of preparedness, in both the availability of effective vaccines and in strategies for responding to outbreaks quickly and effectively.
It is for this reason that the inaugural papers in this new online journal, Global Biosecurity, deserve the attention of the world. Two of the papers are a result of a smallpox virus outbreak simulation exercise carried out by C. Raina MacIntyre, MBBS, PhD, and her colleagues. The purpose of the exercise was to review preparedness for a smallpox bioterrorism attack in the Asian Pacific region and globally. The results of the exercise are sobering. The authors have provided us with substantial food for thought with regard to the speed at which a comprehensive outbreak response could be launched in one or more locations around the world and how the very limited supplies of our current smallpox vaccines will be most effectively deployed. While this work focuses on the Asian Pacific region, the results and lessons learned should be considered by every country in the world.
The two related papers in this issue also deserve close reading: "The current and future landscape of smallpox vaccines," by J. Michael Lane MD, and "Effectiveness of three key antiviral drugs used to treat orthopoxvirus infections: a systematic review" by Yu and Raj. First, a comprehensive and thoughtful review of the current and future landscape of smallpox vaccines by Lane serves as a primer on our current smallpox vaccine capability and potential for developing safer and possibly more effective smallpox vaccines for the future. As Lane points out, however, given the general sense by most countries that smallpox will not again be a serious public health challenge, it's unlikely that we will see much investment in developing and stockpiling these fourth-generation vaccines any time soon.
The systematic review of the effectiveness of three key antiviral drugs used to treat orthopoxvirus infections should also serve as a primer for biopreparedness experts considering use of such drugs in humans and their effect on disease progression. Not covered in this important paper-but nonetheless a critical factor in the potential use of these drugs in a smallpox outbreak-is the number of doses currently available in our government stockpiles.
How will those drugs be deployed in countries where a smallpox outbreak might emerge? There are still many challenges considering how the limited supplies of both vaccine and drugs would be distributed. An intentional attack launched simultaneously in multiple global regions would complicate the response still further.
I welcome this new publication and congratulate the authors for focusing on an often overlooked but nonetheless potentially catastrophic issue: the return of smallpox. Anyone considering such preparedness today should use these four papers to prepare for what could, and unfortunately one day might, happen.

Dr. Herrera arrived in Spain from his native Cuba in the 1990s, a time when the adaptation of epidemiological surveillance legislation to the new territorial model of the State was coming to an end, when new technologies for the transmission and use of information were being incorporated, along with new methods for the analysis of epidemiological information, and when the epidemiological pattern was changing. This new context demonstrated the need to improve training in epidemiological practice, which led the health authorities to propose a training program, based on the EIS model, through a collaboration between the Carlos III Health Institute (ISCIII) and the United States CDC. The National Centre for Epidemiology and the National School of Health, both part of the ISCIII, were responsible for its development and management. The program, called Applied Field Epidemiology (PEAC), began in 1994 (1). Dr John Rullán was the program's first director, at the proposal of the CDC. Among the students of the first class was Dr Dionisio Herrera Guibert, incorporated through a collaboration between the Spanish and Cuban health authorities.
Some of the above-mentioned changes did not occur only in Spain. Between 1980 and 2000, in several countries of the European Union, groups of epidemiologists and health professionals began to discuss the role of epidemiology in public health services and its orientation towards the resolution of alert situations and rapid response, such as outbreaks, health alerts and the evaluation of intervention measures. France was a pioneer, starting an Intervention Epidemiology course in 1984, responsibility for which fell to the Institut pour le Développement de l'Épidémiologie Appliquée (IDEA) in 1985 (2). From 1992, the European Union initiated several community public health actions, among them the creation of the Network Committee for the Epidemiological Surveillance and Control of Communicable Diseases in 1998, with the purpose of harmonizing the actions of the member states. Another of these actions, considered a priority, was the creation of the European Programme for Intervention Epidemiology Training (EPIET) in 1995, with participation from 15 EU countries plus Norway (3). EPIET was the adaptation of the EIS to the European reality and was promoted by former EIS trainees. Other countries followed, such as Spain (1994) and Germany (1996). However, the concern for the development of epidemiological practice in public health services became a general concern at the end of the last century. The WHO, the CDC, and the ECDC, among other national and international organizations, were actively involved in the promotion and development of the training of epidemiologists in the

Introduction
One vision of future conflict involves the use of medical, pharmaceutical and force protection technologies that enhance, support or improve the physiological resilience, combat capabilities and characteristics of a soldier, allowing them to survive and function effectively despite significant Chemical, Biological, Radiological and Nuclear (CBRN) exposures that would normally cause serious injury or death. Enhancements to the physiological and physical characteristics of personnel are risk controls commonly employed to improve health resilience and combat survivability and to increase command assurance that personnel are best prepared to undertake operations (1). Simple examples of enhancement in use in training exercises and military operations today are physical conditioning, acclimatisation, malaria prophylaxis and travel vaccination (1-3). As military offensive technology has developed over centuries, there has been a matched development of defensive and medical technologies driven by military necessity: a medical technology arms race frequently rendering offensive and defensive technologies obsolete. Medical technologies that allow for the enhancement of individual physiological and physical resilience have emerged repeatedly since antiquity and continue to be a prime focus of military medical research. Classic examples of such wartime medical innovations are the French introduction of the field military surgeon and triage systems in the Napoleonic wars to improve combat trauma outcomes, the English use of lemon juice for naval and other forces to prevent vitamin C deficiency (scurvy), the development of penicillin in the Second World War to prevent infection, and the development of tetanus vaccine to prevent complications of contaminated traumatic wounds (1).

¹ Enhancement is defined here as any improvement to a soldier or soldier system that assists or provides advantage with completing the mission, and is generally temporary or reversible in nature and of minimal harm to the recipient.
Injured and ill soldiers on the battlefield, enemy or ally, significantly constrain military operations. For example, the logistic burden of casualty management has been extensively documented in all major theatres of war, famously in north-eastern France in the First World War (4), but also as recently as the first and second Gulf Wars. Similarly, the impact of influenza on the effectiveness and combat power of the German forces on the Western Front in the First World War is a good example of the operational impact caused by infectious disease (5).
Recent technical and scientific developments have opened the door to enhancing the innate defences of personnel and reducing the risk of direct harm when operating in CBRN environments. In this article two recent significant CBRN medical countermeasure (MCM) developments are outlined, followed by a discussion of the benefits and pitfalls associated with the apparent steady technological progression from simple enhancement of innate human characteristics towards a future of augmentation of physical and physiological characteristics over and above natural capacity and ability, made possible through advances in bionics, robotics, nanotechnology, genetics, miniaturisation, and wireless networking and communications.
containing anticholinesterase compounds as a normal part of their diet (17).
It has been observed that certain individuals naturally have higher than normal activity of butyrylcholinesterase (BuChE). This is believed to be due to genetic variation in the gene encoding the enzyme, which results in circulating BuChE with a significantly increased general detoxification capacity. These individuals are naturally resistant to high levels of many anticholinesterase chemicals found in foods and some pharmaceuticals. Animals with similar mutations have likewise been shown to survive nerve agent challenges that are universally lethal to non-mutated animals (14,16,18-21), up to 6 times the LD50 without any symptoms. Human populations possessing similar mutations are thought to be naturally resistant to a wide range of chemical insults, including organophosphates and, almost certainly, nerve agents.
These findings led to the hypothesis that BuChE might be developed as an effective nerve agent MCM that would confer a similar protective effect against nerve agent exposure in the average human, effectively conferring a super-detoxification ability on the recipient. This hypothesis is supported by extensive research and clinical trials undertaken to develop an operationalised MCM (14,16,18,19,21-31). The human BuChE gene has also recently been incorporated into a bacterial host for the purpose of large-scale production of therapeutic BuChE. It is envisaged that BuChE for field use would be administered as a single-dose intravenous infusion providing enhanced detoxification coverage for at least 3 days and likely up to 7, after which time the additional BuChE is degraded and the recipient's blood profile returns to pre-injection norms (22). One future possibility, now enabled by precision gene editing technologies such as CRISPR-Cas9 and similar constructs, is that additional BuChE genes may be incorporated into the genome of individuals. These genetically modified individuals would then possess superior detoxification capacity against nerve agents, organophosphate compounds and many other chemicals and toxicants, with the advantage of not requiring exogenous administration.

Enhancing innate resilience to high dose radiation exposure
Today there exist a number of available technologies that reliably and safely increase the ability of individuals to withstand, without short-term negative effect, high dose whole-body exposure to radiation. Such exposure could occur in the context of the widespread dissemination of radioisotopes during various military and humanitarian assistance operations, but equally could occur in precision strikes against high profile individuals or capabilities. In either case, high dose radiation exposure is a possible, but invisible and difficult to defend against, threat. Acute radiation syndrome occurs in most individuals after high dose exposure; pathological effects of radiation exposure at around 9 Gray, previously masked and not clinically apparent due to mortality at lower exposure levels, are now clinically observable and amenable to investigation and treatment (32,33).

Heslop DJ. Beyond traditional CBRN force protection - a future of CBRN hardened super-soldiers? Global Biosecurity, 2019; 1(1).

Implications of the CBRN hardened soldier
Technologies such as bionics, electronics, robotics, genetics, nanotechnology and wireless communications have opened new research opportunities for the development of enhancements. Most enhancements fielded to date have provided improvements in performance, generally to a predefined minimum standard of health or performance, that are reversible and do not result in permanent change to the individual. Recent technological developments now allow researchers, decision makers and organisations to cross the ill-defined boundary between fully reversible, short-term enhancement and the new territory of augmentation, which may have permanent or unpredictable short and long term effects on individuals. Such technologies have the potential to permanently alter, in unpredictable ways, the human gene pool (1,2,43,44) without consideration of future impacts. The two examples above highlight that there is impetus to move towards more intensive augmentation and enhancement of military forces, and that there has been little if any meaningful engagement with the societies that will have to accept the consequences of these efforts.
Examples of recent enhancement and augmentation research can be divided into a number of categories. Augmentation of physical capabilities (strength, mobility, protection and soldier-machine integration, local or remote) has been extensively explored, including investigations into exoskeletons and liquid and dynamic armour. Augmentation of cognitive capabilities (awareness, attention, memory, planning, learning, language and communication) has been explored, with efforts to develop real-time language translation for tactical use, situation awareness augmentation and automated inference of commanders' intent during planning. Augmentation of the human senses (sight, smell, hearing, touch and taste) has equally been extensively investigated, including dual use technologies such as haptic feedback, electronic tongues, electronic pass-through hearing protection, and telescoping contact lenses. Augmentation of human metabolism (physiological endurance, metabolism of foods, sleep requirements, and overall wellbeing) is the subject of detailed ongoing investigation, with attempts to abolish the requirement for sleep, reduce dependence on regular food intake and increase stamina and endurance in high stress environments (1,3,43).
At the leading edge of research, civil-military cooperative research programs have investigated some military enhancement and augmentation concepts that have only until recently been conceived in fiction. These have included investigations into metabolic modification to enable suspended animation like states and hibernation, near immediate altitude and hypoxia acclimatisation, mind-machine interface for better control of robotic prostheses following amputation or enhanced operational command control and planning, nanotherapeutics and autonomous diagnostics and therapeutics (i.e. self healing) (1,3,43).
Such augmentation raises serious questions: not only ethical questions but questions of leadership, legality, the conduct of military operations, individual psychological impacts and trauma, and intergenerational impacts. For instance, could augmented soldiers be considered "weapons" in themselves and therefore subject to regulation under the Laws of Armed Conflict (LOAC), or be defined as "biological weapons" under the provisions of the Biological and Toxin Weapons Convention? Will augmented personnel work harmoniously with non-augmented personnel, or be a segregated and hostile subpopulation? If augmentation cannot be reversed, what are the longer-term impacts for the personnel? Will they be able to reintegrate successfully into civilian life at the end of their service? Are military personnel, themselves in a hierarchical structure and open to coercion and bias of many types, able to give sufficiently informed consent for augmentation, particularly where it might have whole-of-life effects, or effects in progeny across generations (such as arise from genetic modification)? What are the parameters for defining "acceptable risk" for the purpose of implementing augmentation across a military population (1)? Some of these questions have been the subject of intense debate in philosophical and bioethical circles over the past decades. A recent report has provided a useful framework for exploring these ethical issues, and highlights that ongoing military research and development into enhancement and augmentation is generating policy vacuums that are becoming wider as technological advances continue (1). Its authors raise concerns that enhancement and augmentation research is accelerating with technological innovation and that much of the work is occurring without sufficient ethical or legal consideration. This, they say, is generating risk uncertainties that may ultimately result in strategic surprise and negative societal outcomes (1-3,44).
Balanced against this, however, is the fact that near-peer military rivals are sizing up each other's war-fighter enhancement and augmentation efforts. The opportunities emerging from technological innovations such as augmentation and enhancement create new arenas for great power competition and "arms races". Western militaries may be experiencing a classical military operational and ethical dilemma: what is ethically unpalatable may nonetheless be militarily necessary. Failure to invest in enhancement and augmentation research and development may result in strategic and potentially decisive operational disadvantages with significant real-world ramifications.
There may come a point where CBRN force protection enhancement and augmentation becomes sufficiently sophisticated that many traditional CBRN agents, and the systems designed to defend against them, become obsolete and are thus neutralised as threats. Such enhancements, improving the human body's ability to defend against exposure to an agent or to recover more effectively and return to operations, are currently temporary in nature and fully reversible, with no known long-term effects on the individual. Such reversible interventions are seen as more ethically acceptable, and are more readily accepted by front line personnel, who at the individual level generally desire the maximum level of protection possible. For a military to take the next steps and deliver both CBRN enhancement and augmentation technology, thorny ethical issues surrounding consent, disclosure, duress, undue influence, autonomy, beneficence and non-maleficence at the individual and population level will need to be addressed (1,43).
The question is, where will the arms race of enhancement and augmentation end? Lin, Mehlman and Abney draw a parallel between the recent highly publicised technological advances and research to develop battlefield robots able to deal with the complexities of the modern battlespace, and the intensive but less well publicised human augmentation research to develop more augmented and engineered humans able to manage a wider range of physiological and physical insults (1). They summarise this neatly, suggesting that on the one hand we are attempting to make machines more human, and on the other we are attempting to make humans more machine. The ethical debate surrounding the benefits and risks of this general research direction is intense and ongoing (2,3,44,45). These are not new arguments: an important historical example touching on these ethical concerns is the philosophical argument of Nietzsche in the late 19th century espousing the virtues of the "Übermensch" (45), and the subsequent counterarguments of Santayana (46). In conclusion, accelerating technological innovation and new research methodologies are opening enhancement and augmentation possibilities previously only dreamt of, or found in fiction. Traditional ethical and moral standards are being challenged and stressed, and organisational and social policy and understanding are lagging. While enhancement offers apparently ethically acceptable solutions to high risk problems, such as CBRN force protection, future developments are likely to test the boundaries of acceptability and risk, a challenge in a strategic landscape where miscalculation can have significant and long term negative outcomes.

Introduction
He Jiankui's recently announced germ line modification of two embryos, which were carried to term and born, has raised significant concerns about scientific misconduct, lack of ethical oversight, and the scientific merit of the methods used in the experiment (1). Significant concerns have also been raised over the broader ramifications for human genetic diversity and the unintended or unexpected outcomes of the first modified human embryos being carried to term and resulting in live births (2). Since 2016, several researchers have published work on the editing of human embryos, but that research had not resulted in live births (3). Whilst the ethics of human germline editing has been debated (4), and mechanisms for governance proposed (5), there are broader future ramifications for human society and survival which must be explored.
Since 2013, gene editing has been greatly enabled by CRISPR-Cas9 (a relatively new method for conducting precision editing of genetic information), which has broad application across all of the life sciences, including agriculture, food production and medical therapies (6). The benefits for the treatment and prevention of human disease are substantial (7), including the potential to cure monogenic diseases and alter the clinical impact of polygenic diseases (8). Clinical trials are currently underway to edit human cells for the treatment of cancer (9). Other studies of human germline editing for the prevention of hereditary diseases have also been conducted (3). It should be noted that alternatives to the editing of human genes are also available for the treatment of genetic diseases, such as gene silencing (10). The first gene silencing drug to receive US FDA approval, in 2018, was patisiran, which uses RNA interference to block the harmful effects of hereditary transthyretin amyloidosis on cardiac and nerve function (11). The purpose of He's work, however, was neither the treatment of existing disease nor the prevention of genetic disease, but the design of a human being to be resistant to a potential health threat to which they may or may not ever be exposed (10). The allegedly engineered infants were born to HIV positive fathers, but vertical transmission of HIV is from mother (not father) to infant, so even the purported justification is flawed. He's university (from which he was on unpaid leave to pursue his private enterprise) and the Chinese government have denounced the experiment, so it appears the work was conducted in the private sector without appropriate ethical oversight (12). This highlights how easy it is to conduct such work outside of the regulated academic sector.

An emerging genetic warfare arms race
The potential advantages and drawbacks of germ line modification in military operations have not yet been rigorously explored. Nevertheless, genome editing opens the door to the deliberate conception and selective modification of human beings.

Targets for germ line modification against enemy forces
The potential for this technology to be used for harm against enemies must also not be neglected. It is possible through this technology to insert "sleeper" mechanisms within the genome of a target population, activated through exposure to otherwise innocuous events and causing deleterious effects. These might include subtle modifications of populations affecting resilience to environmental stressors, decreased resistance to infection or disturbances to immunity, and cognitive or behavioural effects. Equally, this could include the development of novel characteristics in organisms or humans, or the exploitation of characteristics of both materiel and unmodified personnel, to achieve military aims (34). In the strategic domain, ecological weaknesses could be introduced into a population through gradual reductions in genetic diversity or the introduction of lower performance genes, leading to a changed ability to adapt to environmental change and stress, and to altered fertility and survival; these risks have already been identified in human and nonhuman modification alike (35). The primary concern is the almost limitless ability of actors to interfere with others using methods and techniques that are difficult to identify, prove or counter. Given that human ambition, greed and competition remain strongly conserved in the population, it would appear inevitable that most of the negative scenarios possible with genomic editing are likely to play out, or have already begun to.

Implications
The recent calls for a moratorium (35,36) highlight the significant concern that germ line editing has stimulated in the scientific, security and wider community. However, we are already in the post-editing era, where not only are the practical techniques for achieving germ line modification now clear, but new research horizons will open. In all domains of society (commercial, military, social) this technology opens up areas of individual and population competition and tension that have been recognised as at best destabilising, and at worst likely to result in unpredictable mass suffering and destruction. Furthermore, CRISPR-Cas9 technology still has problems, with unintended DNA changes that can lead to unexpected consequences including cancer and other diseases (36).
In the military domain, the logic of the strategic balance of power dictates the likely emergence of a genetic warfare arms race, with some form of involvement by all major powers (37). This situation further complicates an already sensitive global balance of power, introducing uncertainties into strategic calculus with possible severe negative implications. He Jiankui's work could be seen as the "Trinity Test" of an era of genetic warfare, not only of germ line modification. However, the implications for humanity may be more profound, far reaching and impactful than the recent nuclear escalations.

CBRN News
Waxing CBRNE, Waning humanity

Chemical, Biological, Radiological, Nuclear and Explosive (CBRNE) concerns across the globe have waxed and waned over recent decades, as they have done stretching back into antiquity. The relentless march of technological innovation, discovery and emergence is tightly coupled to the potential use or misuse of that knowledge for the benefit or harm of humankind. The past few years have seen a sudden proliferation of old and new CBRNE capabilities used for nefarious purposes: the willingness of actors in the Middle East to develop and use traditional chemical agents (chlorine, mustard and others) to generate terror and inflict terrible injuries on non-combatants and combatants alike, the murder of Kim Jong Nam in Kuala Lumpur with the potent nerve agent VX in 2017, and the murders and attempted murders in Salisbury, England with the novel Novichok agents in 2018. These, however, are just the most prominent examples of a multitude of CBRNE agent uses, developments, accidents and capability changes that have occurred over the same period. CBRNE is a small component of a wider and accelerating competition between rivals. Gone are the days of a bipolar or unipolar world order that seemed a more tractable strategic problem. The implications of these momentous geopolitical changes are significant and affect every part of our current and future lives.
In addition to CBRNE agent development, much effort is being expended to develop new forms of offensive weaponry to deliver or disseminate CBRNE agents. Autonomous and remotely piloted aircraft and undersea craft, incorporating various types of Artificial Intelligence capability, are now able to deliver payloads, including CBRNE offensive payloads, to locations without human intervention and through complex and changing environments, even at hypersonic speeds. The ballistic missile technologies of the Cold War may soon be redundant, as far greater flexibility and reliability can be achieved with either airborne or seagoing autonomous delivery options.
Given that much attention is focussed on the importance of radiological, nuclear and chemical weapons proliferation, attention to biological weapons proliferation and potential harm to humanity is often relegated to a less prominent position. However, recent technological innovations and discoveries in the biological arena are creating risks to humanity significantly greater than those currently posed by radiological, chemical and even nuclear weapons. The recent development and use of CRISPR-Cas9 to edit the germ line of two embryos subsequently born has significant near-term and long-term implications for humanity. Widespread commercialisation of gene editing technologies could easily pose an existential risk to the diversity, integrity and sustainability of the human genome for future generations. Given the long timeframes involved in measuring or observing outcomes, and our limited ability to prognosticate risk, our lack of understanding of the genomic risks of modification renders unregulated germ-line engineering at best unjustifiable and unethical, and at worst a major risk to future generations.
One asks whether proponents of germ-line engineering, at this early stage of sophistication, would be comfortable with their own children procreating with altered individuals. What are the rights of such children? Are the ethical, leadership and practical obligations of the scientific community to the wider corpus of humanity being met? These are just a few of the many important questions that now confront us. In sum, we do indeed live in interesting and rapidly evolving times. Given the risks emerging in a changing world, the role of the enlightened and ethical researcher has never been greater, nor the need to highlight and explore the risks of CBRNE proliferation and impact so acute. I ask researchers interested in CBRNE impacts and implications to explore the implications of recent developments and proliferation, to ask deep and probing research questions, and to communicate evidence and analysis through forums such as this, so that the benefits of CBRNE and related developments can be realised while the harms are avoided.

About the author
Associate Professor David Heslop is the Director of Health Management at the School of Public Health and Community Medicine, at UNSW Sydney. He retains significant military responsibilities as Senior Medical Adviser for CBRNE to Special Operations Headquarters Australia and to Australian Defence Force (ADF) joint senior leadership. He is also a clinically active vocationally registered General Practitioner, a senior trainee in Occupational and Environmental Medicine with the Royal Australasian College of Physicians, and a fellowship candidate for the Academy of Wilderness Medicine. During a military career of over 12 years he has deployed into a variety of complex combat environments, and has advanced international training in Chemical, Biological, Radiological, Nuclear and Explosive (CBRNE) Medicine. In 2014, he was appointed as Senior Medical Officer for Special Operations Command, and was the Officer Commanding and Senior Medical Officer to the ADF CBRNE medical incident response element at Special Operations Engineer Regiment from 2012-2015. He has direct experience in planning for and management of major disasters, mass casualty and multiple casualty situations. He participates in the development and review of national and international clinical and operational general military and CBRNE policy and doctrine and in complex systems modelling.