Is it possible to artificially induce a 40°C fever in a controlled and safe manner?


In the search for possible new treatments for COVID-19, I am given to understand that the SARS-CoV-2 virus may deteriorate at high temperatures and high humidity, perhaps weakening it sufficiently for your immune system to be able to handle it.

Is there any research done on artificially inducing a (let's say) 40°C fever, in a controlled manner, such that brain function will not be endangered?

(Are there any fundamental biological functions that would prevent this possibility?)
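For a rough sense of scale, here is a back-of-envelope sketch (in Python) of the heat needed to raise core temperature from 37°C to 40°C. The 70 kg body mass, ~3.5 kJ/(kg·K) specific heat and ~80 W basal metabolic rate are assumed textbook values, not figures from this question:

# Back-of-envelope estimate of the heat needed to raise core body
# temperature from 37 C to 40 C. Mass, specific heat and basal rate
# are assumed textbook values, not figures from this question.

BODY_MASS_KG = 70.0          # assumed adult body mass
SPECIFIC_HEAT = 3.5e3        # J/(kg*K), approximate for human tissue
DELTA_T = 40.0 - 37.0        # K

heat_joules = BODY_MASS_KG * SPECIFIC_HEAT * DELTA_T
print(f"Heat required: {heat_joules/1e3:.0f} kJ (~{heat_joules/4184:.0f} kcal)")

# For scale: at a basal metabolic rate of ~80 W with zero heat loss,
# generating this much heat would take roughly
BASAL_WATTS = 80.0
print(f"~{heat_joules / BASAL_WATTS / 3600:.1f} h at {BASAL_WATTS:.0f} W")

The heat itself is modest (a few hundred kilojoules, roughly the energy of a small meal); the harder problem is thermoregulation, which actively dissipates added heat through sweating and vasodilation (see reference [3] below).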


Possibly Related Questions:


References:
[1] https://www.cancer.gov/about-cancer/treatment/types/surgery/hyperthermia-fact-sheet
[2] https://en.wikipedia.org/wiki/Hyperthermia_therapy
[3] https://en.wikipedia.org/wiki/Thermoregulation
[4] Janeway's Immunobiology (book)


In the search for possible new treatments for COVID-19, I am given to understand that the SARS-CoV-2 virus may deteriorate at high temperatures and high humidity

If you are referring to the linked question, that concerns virus in the air or on a dry surface. It is a test of how long the virus can survive outside of a host before it is desiccated and destroyed by the environment. It is not relevant to viral particles inside an organism.

Are there any fundamental biological functions that would prevent this possibility?

There is little evidence that elevated body temperature has much effect on the duration of infection for most common viral diseases. This is why antipyretic drugs are commonly recommended for people with viral infections.


Dengue and severe dengue

Dengue is a mosquito-borne viral disease that has rapidly spread in all WHO regions in recent years. Dengue virus is transmitted by female mosquitoes, mainly of the species Aedes aegypti and, to a lesser extent, Ae. albopictus. These mosquitoes are also vectors of chikungunya, yellow fever and Zika viruses. Dengue is widespread throughout the tropics, with local variations in risk influenced by rainfall, temperature, relative humidity and unplanned rapid urbanization.

Dengue causes a wide spectrum of disease, ranging from subclinical illness (people may not know they are even infected) to severe flu-like symptoms in those infected. Although less common, some people develop severe dengue, which can involve any number of complications associated with severe bleeding, organ impairment and/or plasma leakage. Severe dengue carries a higher risk of death when not managed appropriately. Severe dengue was first recognized in the 1950s during dengue epidemics in the Philippines and Thailand. Today, severe dengue affects most Asian and Latin American countries and has become a leading cause of hospitalization and death among children and adults in these regions.

Dengue is caused by a virus of the Flaviviridae family and there are four distinct, but closely related, serotypes of the virus that cause dengue (DENV-1, DENV-2, DENV-3 and DENV-4). Recovery from infection is believed to provide lifelong immunity against that serotype. However, cross-immunity to the other serotypes after recovery is only partial, and temporary. Subsequent infections (secondary infection) by other serotypes increase the risk of developing severe dengue.

Dengue has distinct epidemiological patterns, associated with the four serotypes of the virus. These can co-circulate within a region, and indeed many countries are hyper-endemic for all four serotypes. Dengue has an alarming impact on both human health and the global and national economies. DENV is frequently transported from one place to another by infected travellers; when susceptible vectors are present in these new areas, there is the potential for local transmission to be established.


Zoonotic Potential of Emerging Paramyxoviruses

Patricia A. Thibault and Benhur Lee, in Advances in Virus Research, 2017

5.1.5 Postentry Essential Host Factors That Are Species-Specific Are Not Yet Known

While both human and bovine parainfluenza virus 3 (BPIV-3) are extremely similar at both an antigenic and genetic level (see Fig. 2 and Table 1), BPIV-3 is known to be attenuated for human infection (Karron et al., 1995), while HPIV-3 is decidedly not (Pecchini et al., 2015). Interestingly, simply replacing the N (nucleocapsid) protein of HPIV-3 with that of BPIV-3 attenuates the human virus in macaques (Bailly et al., 2000), as does swapping other BPIV-3/HPIV-3 ORFs (Skiadopoulos et al., 2003), although the converse experiment in cattle has not been performed. Infection of macaques and humans with BPIV-3 results in 1- to 3-log lower viral titers recovered, but viral replication does occur and antibody responses are induced (Bailly et al., 2000; Schmidt et al., 2000; Skiadopoulos et al., 2003), which indicates that the species restriction for BPIV-3/HPIV-3 is not absolute. Since we know that BPIV-3 is competent to enter primate cells (Schmidt et al., 2000; Skiadopoulos et al., 2003) and is able to successfully antagonize the human innate immune response by blocking interferon induction with both C and V accessory proteins (Komatsu et al., 2007; Yamaguchi et al., 2014), the species-specific attenuation of BPIV-3 in humans may instead be explained by an incompatibility with a required host factor during RNA replication or virion production. Although examples of species-specific virus–host interactions outside of entry and immune evasion are limited and not well characterized in paramyxoviruses (Tao and Ryan, 1996), a biologically related example of such can be found in H5N1 avian influenza. A mutation in the PB2 polymerase subunit, E-to-K at amino acid 627, is associated with adaptation to replication, transmission, and pathogenicity in mammalian cells (Gabriel et al., 2013). This mutation has recently been determined to restore interaction of the avian influenza polymerase and the mammalian version of ANP32A, a required host factor with significant differences from the avian protein (Long et al., 2016). Paramyxoviruses are likely to have analogous interactions with host proteins that facilitate important stages of the virus life cycle, but since discovery and characterization of enzootic paramyxoviruses in both their current hosts and potential emergent host systems have received less attention and research energy than pandemic and avian influenza viruses, the significance of species-specific host factor restriction remains to be determined and is a ripe area for exploration. As this area of paramyxovirus biology is examined, we may also find that lack of a required host factor negatively synergizes with other barriers like suboptimal receptor interactions to ultimately prevent successful cross-species infection and/or sustained transmissibility in humans.


Pro Science, Not Anti Testing

There must be times in every copywriter’s life when they wish they could go back and change what they’ve written. Courtesy of Science History Institute (Public Domain).

It is right and proper to question new technologies for potential harm as they emerge, lest they conceal another tragedy such as Thalidomide-related birth defects or leave a toxic legacy such as DDT accumulation in the ecosystem. It is even right to question new developments in the light of emerging scare stories, such as that surrounding the MMR vaccine and its supposed connection to autism in the 1990s. This is the point of science: to always question and push the boundaries of human knowledge.

But this article is not dealing with evidence-based research. Instead we are up against a much more primeval part of human nature: the fear of that which we don’t understand. The same impetus that made some of our ancestors burn suspected witches when their livestock became sick is making them ascribe random headaches or other pieces of bad luck to the appearance (or, in the case of those random pieces of street furniture, imagined appearance) of a cell tower. While the concerned citizens will almost certainly all use cellphones, to them these are magic artifacts covered in glowing runes that might as well have been seized from the dust of a hidden tomb as part of the plot of an Indiana Jones movie.

Are we as engineers and technologists in part responsible for this? Have we made the technology so invisible as to be considered witchcraft? The purpose of technology should be to make lives better, and for that to extend to everyone it means you shouldn’t need an engineering background to use it. So yes, we have made it invisible, and perhaps where we’ve been lax is in making the basic concepts a part of the hype for cell technology itself. But no matter how good a job is done in educating the end user, to exercise the vernacular of social media: idiots gonna idiot.

Header image, a broadcast radio curtain array: Mikeinc; derivative work: Chetvorno / CC BY-SA 3.0.


Vaccine design

Vaccine design concerns the selection of antigens, vaccine platforms, and vaccination routes and regimen. The choice of vaccine platform determines the relative immunogenic strength of vaccine-derived viral antigens, whether an immune adjuvant is required and the nature of protective immunity. These attributes also determine the suitability of a vaccine for a particular route of vaccination, and whether a prime–boost vaccination regimen is required to increase vaccine-mediated protective immunity and its durability. Furthermore, the selection of live attenuated viral vaccines or a respiratory mucosal route of vaccination will require more stringent safety testing (Box 2).

Selection of SARS-CoV-2 antigens

The structural proteins present in the infectious virion include S protein, N protein, matrix (M) protein and envelope (E) protein. The N protein coats the large positive-stranded RNA genome, which is encased in a lipid envelope derived from the host cell membrane, into which the other three proteins (S, M and E) are inserted. In the case of SARS-CoV, it has been shown that only antibodies directed to S protein can neutralize the virus and prevent infection75. As a result, all SARS-CoV-2 vaccines in development include at least a portion of the S protein. These may be restricted to only the S1 domain or the RBD.

Non-neutralizing antibodies to both S protein and the other exposed proteins (E and M) are generated. As there is a suspected role of these non-neutralizing antibodies, as well as weakly neutralizing antibodies, in antibody-dependent enhancement (ADE) of disease, the inclusion of other structural (N) and/or non-structural proteins as vaccine antigens may help to create a more balanced response involving both humoral and T cell-mediated immunity. These could be highly expressed proteins such as N protein or highly conserved functional proteins that have a crucial role in the viral life cycle. For example, inclusion of viral enzymes such as the RNA-dependent RNA polymerase in a vaccine design may ensure that it targets all emerging variant strains, as these proteins are highly conserved59,76,77, even across other bat-derived coronaviruses that could emerge as a threat to humans in the future.

Vaccine platforms

In general, vaccine platforms are divided into six categories: live attenuated virus, recombinant viral-vectored vaccines that are bioengineered to express target pathogen antigens in vivo, inactivated or killed virus, protein subunit vaccines, virus-like particles (VLPs) and nucleic acid-based (DNA or mRNA) vaccines. In broad terms, vaccines require two components: antigens from the target pathogen that are provided to or generated by the vaccine recipient, and an infection signal (such as a pathogen-associated molecular pattern or damage-associated molecular pattern) that alerts and activates the host immune system. Live attenuated vaccines can naturally provide both of these components, whereas non-viral vaccine platforms can provide the antigens but often require the artificial provision of signals to alert the immune system, known as adjuvants. Typically, these non-viral vaccine platforms require multiple vaccinations to induce protective immunity, whereas live virus-based vaccines have the ability to provide ‘one-shot’ immunity. Similarly to non-viral platforms, killed virus vaccines sometimes require the inclusion of an adjuvant and repeated administration for full efficacy78. There are immunological pros and cons to each of these technologies, as discussed later (Table 1).
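As a schematic summary of the platform distinctions described above, here is a small Python sketch; the attribute values are illustrative readings of this paragraph, not entries from the review's Table 1:

# Schematic summary of the vaccine-platform trade-offs described in
# the text. Attribute values are illustrative readings of the
# paragraph above, not data from the review's Table 1.

from dataclasses import dataclass

@dataclass
class Platform:
    name: str
    provides_infection_signal: bool  # alerts innate immunity by itself
    needs_adjuvant: bool
    typical_doses: str

PLATFORMS = [
    Platform("live attenuated virus",    True,  False, "single"),
    Platform("recombinant viral vector", True,  False, "single or prime-boost"),
    Platform("inactivated/killed virus", False, True,  "repeated"),
    Platform("protein subunit",          False, True,  "repeated"),
    Platform("virus-like particle",      False, True,  "repeated"),
    Platform("nucleic acid (DNA/mRNA)",  False, True,  "repeated"),
]

for p in PLATFORMS:
    print(f"{p.name:28s} adjuvant={p.needs_adjuvant} doses={p.typical_doses}")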

Vaccination routes and regimens

In addition to the careful selection of vaccine antigens and platform, the route of vaccination is an integral consideration of vaccine strategies52,79. This is particularly important for mucosal pathogens such as SARS-CoV-2 and those pathogens against which optimal protection requires not only neutralizing antibodies but also innate and adaptive cellular immunity17,80. The best window of opportunity for SARS-CoV-2 control and clearance is the asymptomatic or presymptomatic period of COVID-19 (2–12 days), which is likely to require all of the immune protective elements to be present within the respiratory mucosa before viral entry16,17,27. The route of vaccination has a crucial role in determining this52,81. Protective IgG antibodies induced by parenteral vaccination readily appear at the respiratory mucosa, this being the primary mechanism by which intramuscular injection of measles or influenza vaccine offers protection in humans. However, this route of vaccination is unable to effectively induce mucosal IgA antibodies or tissue-resident memory T (TRM) cells in the lungs52,81. By comparison, the respiratory mucosal route of vaccination is adept at inducing antibodies and TRM cells in the respiratory mucosa, as well as macrophage-mediated trained immunity52,54,80,81,82,83,84,85 (Box 1). Inactivated virus, protein subunit and nucleic acid vaccines cannot be administered by the respiratory mucosal route owing to their requirement for potentially unsafe immune adjuvants and repeated delivery (Table 1). By contrast, recombinant viral-vectored vaccines, particularly those using human serotype 5 adenovirus (Ad5) or chimpanzee-derived adenovirus (ChAd), are safe and highly effective for respiratory mucosal vaccination79.

Often, weakly immunogenic vaccines based on inactivated virus, protein subunits, nucleic acids or viral vectors such as Ad26 require a repeated homologous vaccination regimen to be effective. Indeed, most current human vaccines require repeated doses. As it is not yet known which COVID-19 vaccine strategy will be used or for how long the vaccine-induced protection may last in humans, it remains possible that a homologous or heterologous prime–boost vaccination regimen will be required to sustain protection, even with robust stand-alone platforms such as ChAd. The same or a different route may be used for the repeated vaccine delivery.


When it comes to vaccines, suddenly “from vs with” matters again

The media’s attitude to possible vaccine-related injuries highlights how INSANE the “Covid deaths” count always was


I haven’t seen this in MSM, but I may have missed it…

More bad news…probably 6 month extension to government lockdown powers.

Well done, Mr. Knightly! Beautiful piece. Thanks for writing it and for all that Off-Guardian has done to combat the lie.

One of the best articles that I’ve seen on this scamdemic. It is absolutely correct in its analysis, and it covers all crucial points which really highlight the duplicity and mendacity of this Operation COVAIDS. It could have been strengthened just a little if it pointed out the specific exponential rate of fraud involved in ramping up tales of “cases”, which are created spuriously by mere supposition as soon as someone is identified as “infected” by the tests which are notorious for their false positives (and perhaps ONLY false positives). Then they branch out on assumptions based on contact. They also outright blame symptoms on COVAIDS which could have been from anything else. And then they build the death rate inflation into those inflated numbers (this the writer does describe), so the effective inflation of death rates is even far greater than described in the article. Then multiply the deaths by a Chemical Injection which, along with all the other worsening factors in the “response” cause deaths that are blamed on COVAIDS… they are even talking about infections being detected after the Chemical Injections, but not talking about the cause BEING the Chemical Injections, ON TOP OF having no valid testing procedure for a not-isolated virus that is not shown to cause any disease in the first place. Imagine how 4 and 5G fit into all this on top of it!

lol yes please tell us how 5G fits into all this I can’t wait to hear this

5g is needed for the massive amounts of data transfers when the cashless society is introduced. Along with the multiple monitoring devices constantly checking where you are, who you’ve been with and what your body temperature is.

You seem to be very confused about how the deaths are counted in this country.

The government briefings use deaths within 28 days of a positive test as their metric. You are rightly pointing out that this doesn’t describe whether these people actually died because of COVID, or for some other reason while coincidentally having tested positive, e.g. a car accident or whatever. Presumably the government use this statistic because it is available quickly (every day).

HOWEVER, the metric that you should look at, and that does take into account whether someone has died because of COVID (rather than just with COVID), are the statistics from the ONS (Office for National Statistics). This data uses the MCCD (medical certificate of cause of death). When a patient dies, the doctors who were looking after the patient while they were alive (be that in hospital or in the community) write up the cause of death. This is based on seeing the patient, the clinical history, their blood tests, other investigations, imaging etc., which ultimately culminates in doctors making a decision, based on their own best clinical judgement, on what actually caused the death. Cases where doctors are unsure of the cause, or where the patient died suddenly or hadn’t seen a doctor in their last 4 weeks of life, will be referred to the coroner. This process takes time and is presumably why the government use the ‘28 days measure’ instead.

It is very rare that just one thing will be written as the cause of death. For example, someone could die of a heart attack, but that heart attack was most likely the result of several underlying risk factors like high blood pressure, high cholesterol, smoking, obesity etc. So on the death certificate they would write cause of death: 1a heart attack, 1b secondary to high blood pressure, high cholesterol etc. COVID-19 could be written on someone’s MCCD for several reasons, either as the MAIN cause of death or as one of the SECONDARY causes: e.g. someone could be admitted to hospital with COVID but then suffer a pulmonary embolus and die as a result of that, or they were deteriorating from another condition and then contracted COVID. Clearly in these cases COVID might not be the main cause of death but it would certainly be considered contributory. Looking at the ONS statistics, you can see the total number is actually HIGHER for the death certificate count than for deaths within 28 days. [1] This may be due to the fact that during the early stages of the first wave testing wasn’t as readily available, but based on clinical history, examination, blood tests and imaging results doctors can make reasonable assumptions about whether someone had COVID even if they weren’t able to do the nose and throat swab on them. This is why we have doctors, and why medicine isn’t just a simple series of binary outcomes.

Furthermore, looking at the numbers of mentions of COVID as a main cause vs a secondary cause you can see that the majority have COVID as a MAIN cause of death. [2]
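To make the two counting rules concrete, here is a minimal sketch in Python; the record fields are invented for illustration, and the real PHE and ONS pipelines are of course far more involved:

# Minimal sketch contrasting the two COVID death metrics discussed
# above. Record fields are invented for illustration.

from datetime import date

def counted_by_phe_28_day(record: dict) -> bool:
    """Death within 28 days of a positive test, for any reason."""
    if record["positive_test_date"] is None:
        return False
    return (record["death_date"] - record["positive_test_date"]).days <= 28

def counted_by_ons_mccd(record: dict) -> bool:
    """COVID-19 mentioned on the death certificate (main or secondary)."""
    return any("covid" in cause.lower() for cause in record["mccd_causes"])

record = {
    "positive_test_date": date(2021, 1, 3),
    "death_date": date(2021, 1, 20),
    # e.g. died of a pulmonary embolus with COVID-19 contributing
    "mccd_causes": ["1a pulmonary embolus", "2 COVID-19"],
}
print(counted_by_phe_28_day(record), counted_by_ons_mccd(record))  # True True

A death can satisfy one rule and not the other, which is exactly the "from vs with" distinction being debated here.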

That makes sense. However, and correct me if I’m wrong, the official covid death numbers (for example the ones that appear in a Google search) are precisely the covid deaths within the 28-day timeframe.

Yes, it looks like it. Like I said, this is most likely because these can be calculated on a daily basis without the time lag of deaths needing to be registered etc. Please also note that this ‘official’ death number is lower than the one reported by the ONS (using death certification) if you look at the first link you sent.

Hi Stevie, many thanks for your response.

I found this document wherein it states

if before death the patient had symptoms typical of COVID19 infection, but the test result has not been received, it would be satisfactory to give ‘COVID-19’ as the cause of death, tick Box B and then share the test result when it becomes available. In the circumstances of there being no swab, it is satisfactory to apply clinical judgement.

Given that a large number (over a third?) of covid deaths have occurred in care homes, not hospitals, would you agree that this is the time for 100% accurate data, not basing it on assumptions just because someone displayed symptoms of covid?

Can you tell me why there are no, or very few, recorded deaths from flu this winter, as reported by Bel Mooney (relaying the comments of a Registrar) in the article I linked to in an earlier post?

We know that a lot of people have had covid and had no problems at all. You missed out the phrase at the beginning of your response, ‘death for any reason’ – to me this is crucial, as this phrase is also omitted by the newsreader on the BBC News at 6pm every night when they relay the latest ‘toll’, despite the words being clearly displayed at the bottom of the screen. Why is it omitted?

I broke my leg in December 2020 and the attending nurse was telling me how quiet it had been at the hospital, particularly last Spring, 2020, when the lockdown began.

The images of hospitals at breaking point reminded me of the winter of 2017/18, when the NHS was at breaking point. I can recall stories about hospitals ‘about to run out of oxygen’ during this recent winter. In 2018, there were stories that some hospitals actually had run out of oxygen.

There was a pandemic in 2009. Why was there no lockdown or even a mention of closing airports? In 1968/70 we had the so-called Hong Kong flu, when between 1 million and 4 million died worldwide but there was no shutdown of the economy or society.

For me, this has been media-hyped. People have been scared to death and claims of the lethal nature of the virus simply don’t stack up when you read about the stats for people below the age of 60.

We have destroyed our economy, lost our freedoms, caused unknown numbers of suicides and associated cancer, stroke and heart deaths for a virus we could have dealt with by shielding the frail and elderly. I received my first bowel cancer testing kit when I was 60. I’m supposed to receive the next one every two years. I’m 63 in 2 months, so I’ve missed a year. I was told the NHS doesn’t have the time and resources to test them. How will this affect cancer death stats in years to come?

As usual, the real figures will emerge after an enquiry. Neil Ferguson will be given a knighthood for services to crap computer modelling (and maybe he will actually spend some time studying for a GCSE in Biology) and this 2 year period will be known as the time when scientists were allowed to govern the country due to the weakness of the PM and the government.

We have severe flu seasons – I was never once asked to wear a mask when flu was ripping through care homes and we visited my 90 year old mother (who survived double pneumonia during the 2017/18 winter). She’s still with us, thankfully.

A year ago, I, like everyone else, was disinfecting my post as it came through the door, ditto the shopping, washing my hands and singing happy bloody birthday twice each time. Now – nope. I shan’t be having the vaccine. I shall rely on my immune system – and so too should the younger generation. But that’s up to them and their parents.

Firstly, clinical judgement is applied all the time in medicine. There are no absolute tests in medicine (as with most things in science), which is one of the reasons we have doctors in the first place. You question whether this is enough to write a death certificate and whether we should not try to be ‘100% accurate’. There is no such thing as 100% accuracy: even if you performed an autopsy you would still be making assumptions based on the clinical history, examination, blood tests, imaging etc., along with post-mortem findings. Death certs are always, and have always been, based on some degree of assumption. Feeling generally unwell, febrile, dry cough, silent hypoxia, lymphopenic, thrombocytopenic, with similarly unwell contacts, low procalcitonin, then getting clinically worse around day 7-10 in the adaptive immunity phase with minimal/no improvement on antibiotics and subsequently elevated inflammatory markers screams COVID even before a PCR test (remembering that the sensitivity of the PCR is around 75%, i.e. it will MISS 1 in 4 cases of COVID, depending on when in the illness it was taken). Doctors writing death certs by law have to have seen the patient in their last 28 days of life, hence have an idea of the history, have likely examined them, and even in the care home have access to bloods, bacterial cultures and other investigations. If you think that making healthcare decisions based on the above pieces of information is insufficient, you might be frightened to know that this is how we conduct 99% of medicine, with most of it actually based on the history. I would be interested to know what level of investigation you feel would be adequate.
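To put the ‘75% sensitivity’ point in concrete terms, here is a quick Bayes calculation in Python; the pre-test probability and specificity are invented for illustration, only the 75% sensitivity comes from the comment above:

# Quick illustration of the PCR point above: with 75% sensitivity,
# a negative swab still leaves a real chance the patient has COVID.
# Pre-test probability and specificity are invented examples.

sensitivity = 0.75   # P(test positive | disease), as quoted above
specificity = 0.95   # assumed, not from the discussion
pre_test    = 0.40   # assumed pre-test probability from clinical picture

p_neg_given_disease = 1 - sensitivity
p_neg = p_neg_given_disease * pre_test + specificity * (1 - pre_test)
post_test_given_negative = p_neg_given_disease * pre_test / p_neg

print(f"P(COVID | negative PCR) = {post_test_given_negative:.0%}")  # ~15%
# A negative swab alone cannot rule COVID out, which is why clinical
# judgement still carries weight on death certificates.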

Do I think SOME of the COVID care home deaths have been misdiagnosed? Sure. But is it on such a grand, conspiratorial scale, and would it have ultimately impacted wider public decision making? No. The sheer numbers of patients who demonstrably had COVID in the hospitals, along with the excess deaths, make it obvious that the same situation was occurring in the care homes as well.

Like I alluded to in my original post – I think the PHE ‘death within 28 days of a +ve swab for any reason’ is a poor metric and you should look at the ONS statistics for the most reliable indicator of COVID deaths, which incidentally show that the PHE numbers under-represent the number of COVID deaths. I can only assume the govt use this metric as it is available more readily, without having to wait for deaths to be registered etc., but the communication about why it is used has overall been poor and creates a question of legitimacy in an already skeptical public.

With regards to hospitals being ‘quiet’, this is very variable, depending on the hospital and the member of staff. As the lockdown took effect in the first wave there were very few people coming to hospital. A lot of the reasons I personally heard, particularly from the elderly who had sat on their problems, were ‘I didn’t want to bother anyone at this busy time’ or ‘I was scared of getting COVID’ or even ‘I thought the hospital was closed’. NHS trusts were tweeting to remind people to come to hospital if they were unwell! In addition to all of that, there were none of the usual drunks coming in to A+E on Saturday nights, there were fewer traffic accidents, and elective surgeries had been cancelled. So for a short period it was very quiet, and COVID was really only in London. But as time went on the numbers started increasing massively and I have never seen anything like it in my years of being a doctor. Areas that were previously for outpatients were converted into entire wards and they were FULL of COVID patients, many relatively young and without significant comorbidity. There were certainly times where oxygen pressures were running low. Doctors were being ‘redeployed’ from other areas, e.g. clinical geneticists or plastic surgeons with relatively little work began working in ITU. HDU and theatre recovery areas were converted into functioning ITU spaces. The NHS just about keeps it together on a good day due to chronic underfunding, but throw a pandemic in there and you begin to really stretch it. It was this massive effort from ALL hospitals that ultimately managed to accommodate this huge influx of patients. Had we had this on top of the regular scheduled service we would have been unable to cope, without a doubt. In many ways, if you work outside of ITU, COVID is actually quite a simple disease. You can give oxygen and steroids, and if that doesn’t work there’s not really much else you can do other than call ITU. So it made the job relatively simple for those who worked outside of ITU, not having to deal with a whole ward of patients all of whom had different, more complex issues. So you can see how SOME people might say it was ‘quiet’ while others have a different view. Ultimately you can’t draw meaningful conclusions based on ‘My mate Barry works in the hospital and he said XYZ’ – it really depends on the circumstances.

It isn’t the case that there are ‘no influenza’ deaths. You can see the influenza death stats on the ONS website; they are lower this year, likely due to masks, social distancing and the fact that more people died earlier in the year.

I’m not really here to debate the pros and cons of lockdown; there are certainly good points on either side. I’m not a policy maker or politician. My main concern was with the misinformation in the article. But I don’t think simply shielding the elderly would have been successful given the virulence of this disease, and I think the NHS would have very much struggled to cope with the sheer numbers had there been no enforced measures to attempt to reduce transmission.


Dengue and severe dengue

Dengue is a mosquito-borne viral infection causing a severe flu-like illness and, sometimes, a potentially lethal complication called severe dengue. The incidence of dengue has increased 30-fold over the last 50 years. An estimated 50-100 million infections now occur annually in over 100 endemic countries, putting almost half of the world’s population at risk.

Dengue is a vector-borne disease transmitted by the bite of an infected mosquito. There are 4 serotypes of the virus that causes dengue. These are known as DEN-1, DEN-2, DEN-3, DEN-4.

Severe dengue is a potentially lethal complication which can develop from dengue infections.

It is estimated that there are 50-100 million cases of dengue worldwide each year, and that 3 billion people live in dengue-endemic countries.

Dengue is mainly transmitted by a mosquito (Aedes aegypti) and is distributed across all tropical countries. Ae. aegypti and other species such as Ae. albopictus are highly adaptive, and their combined distribution can spread dengue farther north into Europe or North America during summer. (Note: travellers already infected with the virus also spread the disease when they get bitten by the local Aedes mosquito population.)

Dengue outbreaks can occur anytime, as long as the mosquitoes are still active. However, in general, high humidity and temperature are conditions that favour mosquito survival, increasing the likelihood of transmission.

Dengue fever

Dengue causes flu-like symptoms that last for 2-7 days. Dengue fever usually occurs after an incubation period of 4-10 days following the bite of an infected mosquito.

High fever (40°C/104°F) is usually accompanied by at least two of the following symptoms:

  • Headaches
  • Pain behind eyes
  • Nausea, vomiting
  • Swollen glands
  • Joint, bone or muscle pains
  • Rash
Severe dengue

When dengue develops into severe dengue, the critical phase takes place around 3-7 days after the first sign of illness. Temperature will decrease; this does NOT necessarily mean the person is recovering. Instead, special attention needs to be given to the following warning signs, as they could indicate progression to severe dengue (a toy checklist sketch follows the list below):

  • Severe abdominal pain
  • Persistent vomiting
  • Bleeding gums
  • Vomiting blood
  • Rapid breathing
  • Fatigue/ restlessness
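As mentioned above, here is a toy Python checklist expressing the two lists. It is illustrative only, not a diagnostic tool, and the 40°C threshold and symptom names are taken directly from this fact sheet:

# Toy checklist based on the criteria above: probable dengue is high
# fever plus at least two listed symptoms; any warning sign during the
# critical phase should prompt urgent care. Illustrative only.

FEVER_SYMPTOMS = {
    "headache", "pain behind eyes", "nausea/vomiting",
    "swollen glands", "joint/bone/muscle pain", "rash",
}
WARNING_SIGNS = {
    "severe abdominal pain", "persistent vomiting", "bleeding gums",
    "vomiting blood", "rapid breathing", "fatigue/restlessness",
}

def probable_dengue(temp_c: float, symptoms: set[str]) -> bool:
    return temp_c >= 40.0 and len(symptoms & FEVER_SYMPTOMS) >= 2

def needs_urgent_care(symptoms: set[str]) -> bool:
    return bool(symptoms & WARNING_SIGNS)

s = {"headache", "rash", "persistent vomiting"}
print(probable_dengue(40.1, s), needs_urgent_care(s))  # True True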

When severe dengue is suspected, the person should be rushed to the emergency room or to the closest health care provider, as it causes:

  • Plasma leaking that may lead to shock and/or fluid accumulation with/without respiratory distress
  • Severe bleeding
  • Severe organ impairment.

There is no vaccine or specific medication for dengue fever.

Patients should seek medical advice, rest and drink plenty of fluids. Paracetamol can be taken to bring down fever and reduce joint pains. However, aspirin or ibuprofen should not be taken since they can increase the risk of bleeding.

Patients who are already infected with the dengue virus can transmit the infection via Aedes mosquitoes after their first symptoms appear (for 4-5 days, up to a maximum of 12). As a precautionary approach, patients can adopt measures to reduce transmission by sleeping under a treated net, especially during the period of illness with fever.

Infection with one strain will provide lifetime protection only against that particular strain. However, it is still possible to become infected with other strains and to develop severe dengue.

When warning signs of severe dengue are present (listed above), it is imperative to consult a doctor and seek hospitalization to manage the disease.

With proper medical care and early recognition, case-fatality rates are below 1%. However, the overall experience remains very uncomfortable and unpleasant.

If you suspect you have dengue you need to see a doctor immediately. To diagnose dengue fever, your doctor will:

  • Evaluate your signs and symptoms
  • Test your blood for evidence of a dengue virus
  • Review your medical and travel history.

Persons who have travelled to dengue-endemic countries during the past two weeks should inform the doctor about it.

Dengue is spread through the bite of the female mosquito (Aedes aegypti). The mosquito becomes infected when it takes the blood of a person infected with the virus. After about one week, the mosquito can then transmit the virus while biting a healthy person. The mosquito can fly up to 400 meters looking for water-filled containers in which to lay its eggs, but usually remains close to human habitation.

Aedes aegypti is a daytime feeder: The peak biting periods are early in the morning and in the evening before dusk.

Dengue cannot be spread directly from person to person. However, a person infected with and suffering from dengue fever can infect other mosquitoes. Humans are known to carry the infection from one country to another, or from one area to another, during the stage when the virus circulates and reproduces in the bloodstream.

Aedes aegypti is an intermittent biter that prefers to bite more than one person during a feeding period. This behaviour has made Aedes aegypti a highly efficient epidemic vector mosquito.

The mosquitoes thrive in areas close to human populations (urban areas).

The dengue mosquito lays its eggs in water-filled containers inside the house and in areas surrounding dwellings (this includes unused bottles, containers, discarded waste, tyres etc., which hold water).

The eggs hatch when in contact with water. Eggs can withstand very dry conditions and survive for months. Female mosquitoes lay dozens of eggs up to 5 times during their lifetime.

Adult mosquitoes usually rest indoors in dark areas (closets, under beds, behind curtains). Here they are protected from wind, rain and most predators, which increases their life expectancy and the probability that they will live long enough to pick up a virus from one person and pass it on to the next.

The best preventive measure for areas infested with Aedes mosquitoes is to eliminate their egg-laying sites – called source reduction. Lowering the number of eggs, larvae and pupae will reduce the number of emerging adult mosquitoes and the transmission of the disease. Examples of such habitats are listed below:

  • Indoor
    • Ant traps
    • Flower vases and saucers
    • Water storage tanks (domestic drinking water, bathroom, etc.)
    • Plastic containers
    • Bottles
  • Outdoor
    • Discarded bottles and tins
    • Discarded tyres
    • Artificial containers
    • Tree holes, potholes, construction sites
    • Drums for collecting rainwater
    • Shells, husks, pods from trees
    • Leaf axils of various plants
    • Boats, equipment

Items that collect rainwater or are used to store water should be covered or properly discarded. The remaining essential containers should be emptied, cleaned and scrubbed (to remove eggs) at least once a week. This will prevent adult mosquitoes from emerging from the egg/larva/pupa stages.

In fact, community participation is the key to dengue prevention. If every household aims to reduce vector density, the transmission rate will decrease or may even stop.

The most effective way to protect yourself from mosquito bites is to reduce the amount of exposed skin available for mosquitoes to bite. Long-sleeved clothing and mosquito repellents (containing DEET, IR3535 or Icaridin) are the most viable options.

Window and door screens and air conditioning reduce the risk of mosquitoes coming into contact with household members. Mosquito nets (and/or insecticide-treated nets) will also provide additional protection to people sleeping during the day, or protect against other mosquitoes which bite at night (such as those transmitting malaria). Household insecticide aerosols, mosquito coils or other insecticide vaporizers may also reduce biting activity.


    VIII. Judicial Decisions / Prominent Cases

A landmark case of significance in the early development of the US biotechnology industry was the US Supreme Court’s 1980 decision in Diamond v. Chakrabarty,[95] holding that genetically engineered microorganisms can be patented. This decision “contributed to a revolution in biotechnology that has resulted in the issuance of thousands of patents, the formation of hundreds of new companies, and the development of thousands of bioengineered plants and food products.”[96]

Outside of patent law, however, the role of the US courts in shaping regulatory policy with respect to GMOs has been limited. A common theme among many court decisions on GMOs has been the judiciary’s deference to agency expertise in determining how to regulate them.

The Supreme Court’s decision in Monsanto Co. v. Geertson Seed Farms[97] involved a challenge under NEPA to APHIS’s decision to issue a determination of nonregulated status to Monsanto’s Roundup Ready Alfalfa (RRA), a genetically engineered variety of alfalfa, after making a “finding of no significant impact” determination instead of preparing an environmental impact statement (EIS). The district court ruled that an EIS was required; as a remedy it enjoined APHIS from deregulating RRA, in whole or in part, pending completion of the EIS, and also enjoined almost all planting of RRA nationwide.[98] The Supreme Court reversed, ruling that the district court exceeded its authority in enjoining APHIS from partially deregulating RRA and enjoining the planting of RRA. It concluded that APHIS should be allowed to structure a partial deregulation order while completing the EIS.[99]

In Alliance for Bio-Integrity v. Shalala,[100] the plaintiffs challenged the FDA’s 1992 policy statement that GMO foods that are similar to conventional varieties would be presumptively deemed “generally recognized as safe” (GRAS)[101] and that they need not be labeled.[102] The district court declined to rule that the FDA’s decision that genetic modification does not “materially” alter foods, and its presumption that GMO foods are GRAS, was arbitrary and capricious, stating that “the rationale for [court] deference [to agency decision making] is particularly strong when the [agency] is evaluating scientific data within its technical expertise.”[103] As to labeling, it said that given the FDA’s decision on the GRAS issue, it would also find that the FDA’s interpretation of the FFDCA’s labeling requirement was reasonable.[104]

    Other GMO cases have similarly displayed the tendency of US courts to defer to agency decision making.[105]

    Luis Acosta
    Senior Legal Information Analyst
    March 2014


    On a November day in 1721, a small bomb was hurled through the window of a local Boston Reverend named Cotton Mather. Attached to the explosive, which fortunately did not detonate, was the message: “Cotton Mather, you dog, dam you! I’ll inoculate you with this with a pox to you.’’ This was not a religiously motivated act of terrorism, but a violent response to Reverend Mather’s active promotion of smallpox inoculation. The smallpox epidemic that struck Boston in 1721 was one of the most deadly of the century in colonial America, but was also the catalyst for the first major application of preventative inoculation in the colonies. The use of inoculation laid the foundation for the modern techniques of infectious diseases prevention, and the contentious public debate that accompanied the introduction of this poorly understood medical technology has surprising similarities to contemporary misunderstandings over vaccination.

    The Boston Epidemic

    For over a year, from the spring of 1721 until winter 1722, a smallpox epidemic afflicted the city of Boston. Out of a population of 11,000, over 6000 cases were reported with 850 dying from the disease. Of a series of seven epidemics in the region during the 1700s, this was the most deadly [2]. Though tragic, the 1721 epidemic led to a major milestone in the history of vaccination and smallpox eradication. The use of inoculation during this epidemic, and the heated debate that arose surrounding the practice, was one of the first major applications of inoculations in western society, paving the way for Edward Jenner to develop smallpox vaccination by the end of the century.

    The Disease and Early Inoculation

    Smallpox is an ancient disease caused by the Variola virus. This virus exists in two main forms: Variola major, which historically has a mortality rate of around 30%, and the less severe Variola minor with a mortality rate around 1% [3]. Variola major is predominantly transmitted either by direct or indirect contact with the respiratory droplets from an infected individual [4]. The natural pathogenesis of Variola major begins with the infection of the mucous membrane of the upper respiratory system, then invasion of the bloodstream, and eventually the skin, producing the classical presentation of smallpox pustules and signifying that the patient has become infectious. Death can result from toxins in the blood, blood clots, and septic shock [5].

Inoculation against smallpox is believed to have been practiced in China as far back as 1000 BC, and is reported to have been common in India, Africa, and Turkey prior to its introduction into western societies in the 18th century [1]. In China the practice was to blow dried and ground Variola scabs into the nostrils of the patient. In Turkey, however, the technique of inoculation involved inducing a less serious form of the smallpox disease by exposing an incision to the Variola pus [6]. The latter is the procedure that was eventually brought to England and colonial America. The idea was based on the basic observation that those who survived smallpox, moderate or severe, were significantly less likely to contract the disease again. By deliberately inducing an acute smallpox infection through a small localized wound, a healthy person was more likely to survive the infection than if they had acquired the disease naturally through aerosolized viral particles. Smallpox vaccination, as developed by Edward Jenner in the late 1700s, worked on the same principle but differed in that the viral source was the less dangerous cowpox disease (Table 1) [7]. Today, smallpox vaccination uses the Vaccinia virus to induce immunity, and the principle of vaccination has been applied to battling numerous other infectious diseases.

    Table 1 The primary difference between the methods of inoculation and vaccination, which both generate an immunity against smallpox, was in the viral source. Inoculation used actual smallpox material, while vaccination used immunologically-related cowpox, and now Vaccinia virus.

    Introducing Inoculation to the West

Although inoculation was already common in certain parts of the world by the early 18th century, it was only just beginning to be discussed in England and colonial America. Cotton Mather is largely credited with introducing inoculation to the colonies and doing a great deal to promote the use of this method as standard for smallpox prevention during the 1721 epidemic. Mather is believed to have first learned about inoculation from his West African slave Onesimus, writing, “he told me that he had undergone the operation which had given something of the smallpox and would forever preserve him from it, adding that it was often used in West Africa.’’ After confirming this account with other West African slaves and reading of similar methods being performed in Turkey, Mather became an avid proponent of inoculation [8]. When the 1721 smallpox epidemic struck Boston, Mather took the opportunity to campaign for the systematic application of inoculation. What followed was a fierce public debate, but also one of the first widespread and well-documented uses of inoculation to combat such an epidemic in the West.

    The Outbreak in Boston

On April 22, 1721, a British ship arrived in Boston Harbor. On board, one of the sailors had begun to exhibit symptoms of smallpox. He was quickly quarantined, but several more members of the crew soon fell ill with the disease. An outbreak of the disease spread quickly through the city [1]. As the epidemic worsened, Cotton Mather reached out to the medical community of Boston, imploring them to use the inoculation method. One physician, Zabdiel Boylston, heeded his call, but most other doctors were hostile to the idea. At the forefront of the anti-inoculation contingent was one of Boston’s only physicians who actually held a medical degree, Dr. William Douglass. The arguments against inoculation were varied, ranging from disagreement on religious grounds to scientific uncertainty. While many argued that inoculation violated divine law, by either inflicting harm on innocent people or by attempting to counter God’s specific will, the main argument that Douglass made was that inoculation was untested and seemingly based on folklore. Douglass feared that unchecked use of inoculation would only quicken the spread of disease throughout the city [8].

By modern standards, this argument seems highly sensible. The use of a poorly researched medical technique, particularly one as potentially hazardous as intentionally exposing healthy people – including children – to smallpox, would be highly unethical today. To many professional Boston physicians, inoculation must have appeared as unscientific as other contemporary treatments such as bleeding and purging, which were still common practice during the early 18th century.

    But as the epidemic began to diminish in early 1722, Mather and Boylston had collected surprisingly thorough data that made a clear argument for the effectiveness of inoculation (Figure 1). Boylston, who had personally inoculated some 287 people, recorded that of those inoculated only 2% had died. In comparison, the mortality rate of the naturally occurring disease during that year was 14.8% [1].

    Figure 1 As inoculation became increasingly common practice in Boston during the 18 th century, the incidence of smallpox fatalities steadily decreased.
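Restating Boylston's numbers as a simple risk comparison (a Python sketch; the death count among the inoculated is derived from the quoted 2% rate, and historical sources differ slightly on the exact figures):

# The inoculation numbers reported above, restated as a risk
# comparison. The inoculated death count is derived from the quoted
# 2% rate; historical sources differ slightly.

inoculated        = 287
inoculated_deaths = round(0.02 * inoculated)   # ~6 deaths, per the 2% figure
natural_mortality = 0.148                      # 14.8% quoted above

inoculated_mortality = inoculated_deaths / inoculated
risk_ratio = natural_mortality / inoculated_mortality

print(f"Inoculated mortality:   {inoculated_mortality:.1%}")
print(f"Natural-case mortality: {natural_mortality:.1%}")
print(f"Relative risk of natural smallpox: {risk_ratio:.1f}x higher")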

    Although inoculations were themselves a risky practice and carried a not-insignificant health risk, this data demonstrates that inoculations were significantly less fatal than the naturally occurring virus. Ultimately, this helped to disprove the opposition’s fear that such a technique would only facilitate the spread of disease. Mather and Boylston’s advocacy and observations resulted in what was actually one of the earliest clinical trials on record, and the use of both experimental and control groups to demonstrate the effectiveness of inoculation significantly aided the adoption of the practice [1,9].

Smallpox continued to be a significant health threat throughout the 18th and 19th centuries, and part of the 20th, but the introduction and success of inoculation in the early 1700s, followed later by the much safer vaccination method developed by Edward Jenner, steadily reduced the threat the disease posed until its eradication in 1980 (Figure 2) [10].

Figure 2 When the much safer practice of vaccination became the predominant method of combating smallpox at the end of the 18th century, annual deaths from the disease fell to only a fraction of what they had been less than a hundred years prior.

    Then and Now

The debate over the use of inoculation, particularly apparent during the 1721 epidemic in Boston, still bears relevance today. Modern vaccination campaigns, most notably targeting the eradication of polio, continue to face violent opposition in many parts of the world where the disease is still present, particularly in Pakistan, Afghanistan, and Nigeria [11]. Just this past November, four polio vaccination workers were killed in Pakistan [12]. Even in the United States, outbreaks among groups of unvaccinated individuals have risen in the past decade, a trend that is often attributed to the spread of misinformation regarding the potential risks, contents, and mechanism of vaccination [13,14]. The story of the 1721 Boston smallpox epidemic, and the controversy that accompanied the introduction of inoculation by Dr. Boylston and Cotton Mather, exemplifies how opposition to inoculation, and then vaccination, has been present for as long as the practices themselves. Although there is still a great deal of work to be done in the fight against infectious diseases and enclaves of opposition remain, the effectiveness and benefit of vaccination has been clearly demonstrated over many decades of systematic application that began with the work of Mather and Boylston [15].

    Matthew Niederhuber is a Research Assistant in the Silver Lab in the Department of Systems Biology at Harvard Medical School.

    References

    [1] Best, M., Neuhauser, D., and Slavin, L. “Cotton Mather, you dog, dam you! I’l inoculate you with this with a pox to you”: smallpox inoculation, Boston, 1721. Quality and Safety in Health Care 2004. 13:82-83.

    [2] Henry, E. H. Experience in Massachusetts and a Few Other Places with Smallpox and Vaccination. Boston Medical and Surgical Journal 1921. 185:221-228.

    [3] Centers for Disease Control and Prevention: “Smallpox Disease Overview” http://emergency.cdc.gov/agent/smallpox/overview/disease-facts.asp

    [4] World Health Organization: “Frequently asked questions and answers on smallpox” http://www.who.int/csr/disease/smallpox/faq/en/

    [5] Behbehani, A. The Smallpox Story: Life and Death of an Old Disease. Microbiological Reviews 1983. 47.4:455-509.

    [6] U.S. National Library of Medicine: “Smallpox: A Great and Terrible Scourge” http://www.nlm.nih.gov/exhibition/smallpox/sp_variolation.html

    [7] Riedel, S. Edward Jenner and the history of smallpox and vaccination. Baylor University Medical Center Proceedings 2005. 18.1:21-25.

    [8] Buhr, S. To Inoculate or Not to Inoculate?: The Debate and the Smallpox Epidemic of 1721. Constructing the Past 2000. 1:61-67.

    [9] Boylston, Z. An Historical Account of the Small-pox Inoculated in New England, Upon All Sorts of Persons, Whites, Blacks, and of All Ages and Constitutions: With Some Account of the Nature of the Infection in the Natural and Inoculated Way, and Their Different Effects on Human Bodies: with Some Short Directions to the Unexperienced in this Method of Practice / Humbly Dedicated to Her Royal Highness the Princess of Wales. Boston: 1730.

    [11] Bhutta, Z. Infectious disease: Polio eradication hinges on child health in Pakistan. Nature 2014. 511:285-287.

    [12] Yousafzai, G. “Militants kill four polio workers in Pakistan.” Reuters, 26 Nov. 2014.

    [13] Omer, S. et al. Vaccine Refusal, Mandatory Immunization, and the Risks of Vaccine-Preventable Diseases. New England Journal of Medicine 2009. 360:1981-1988.

    [14] Wang, E. et al. Nonmedical Exemptions From School Immunization Requirements: A Systematic Review. American Journal of Public Health 2014. 104(11):e62-84.


    Process Control

    Process control is a statistical and engineering discipline that deals with the design of mechanisms for maintaining the output of a specific process within a desired range. These activities ensure that a process is predictable, stable, and consistently operating at the target level of performance with only normal variation. Process control enables the mass production of continuous processes, as well as a level of automation by which a small staff can operate a complex process from a central control room.

    All operations in the receiving, inspecting, transporting, segregating, preparing, manufacturing, packaging, and storing of food shall be conducted in accordance with adequate sanitation principles. Appropriate quality control operations shall be employed to ensure that food is suitable for human consumption and that food-packaging materials are safe and suitable. Overall sanitation of the plant shall be under the supervision of one or more competent individuals assigned responsibility for this function. All reasonable precautions shall be taken to ensure that production procedures do not contribute contamination from any source. Chemical, microbial, or extraneous-material testing procedures shall be used where necessary to identify sanitation failures or possible food contamination. All food that has become contaminated to the extent that it is adulterated within the meaning of the act shall be rejected, or if permissible, treated or processed to eliminate the contamination.

    Processes and Controls

    (a) Raw materials and other ingredients:
    1. Raw materials and other ingredients shall be inspected and segregated or otherwise handled as necessary to ascertain that they are clean and suitable for processing into food and shall be stored under conditions that will protect against contamination and minimize deterioration. Raw materials shall be washed or cleaned as necessary to remove soil or other contamination. Water used for washing, rinsing, or conveying food shall be safe and of adequate sanitary quality. Water may be reused for washing, rinsing, or conveying food if it does not increase the level of contamination of the food. Containers and carriers of raw materials should be inspected on receipt to ensure that their condition has not contributed to the contamination or deterioration of food.
    2. Raw materials and other ingredients shall either not contain levels of microorganisms that may produce food poisoning or other disease in humans, or they shall be pasteurized or otherwise treated during manufacturing operations so that they no longer contain levels that would cause the product to be adulterated within the meaning of the act. Compliance with this requirement may be verified by any effective means, including purchasing raw materials and other ingredients under a supplier's guarantee or certification.
    3. Raw materials and other ingredients susceptible to contamination with aflatoxin or other natural toxins shall comply with current Food and Drug Administration regulations and action levels for poisonous or deleterious substances before these materials or ingredients are incorporated into finished food. Compliance with this requirement may be accomplished by purchasing raw materials and other ingredients under a supplier's guarantee or certification, or may be verified by analyzing these materials and ingredients for aflatoxins and other natural toxins.
    4. Raw materials, other ingredients, and rework susceptible to contamination with pests, undesirable microorganisms, or extraneous material shall comply with applicable Food and Drug Administration regulations and defect action levels for natural or unavoidable defects if a manufacturer wishes to use the materials in manufacturing food. Compliance with this requirement may be verified by any effective means, including purchasing the materials under a supplier's guarantee or certification, or examination of these materials for contamination.
    5. Raw materials, other ingredients, and rework shall be held in bulk, or in containers designed and constructed so as to protect against contamination and shall be held at such temperature and relative humidity and in such a manner as to prevent the food from becoming adulterated within the meaning of the act. Material scheduled for rework shall be identified as such.
    6. Frozen raw materials and other ingredients shall be kept frozen. If thawing is required prior to use, it shall be done in a manner that prevents the raw materials and other ingredients from becoming adulterated within the meaning of the act.
    7. Liquid or dry raw materials and other ingredients received and stored in bulk form shall be held in a manner that protects against contamination.
    (b) Manufacturing operations:
    1. Equipment and utensils and finished food containers shall be maintained in an acceptable condition through appropriate cleaning and sanitizing, as necessary. Insofar as necessary, equipment shall be taken apart for thorough cleaning.
    2. All food manufacturing, including packaging and storage, shall be conducted under such conditions and controls as are necessary to minimize the potential for the growth of microorganisms, or for the contamination of food. One way to comply with this requirement is careful monitoring of physical factors such as time, temperature, humidity, aw, pH, pressure, flow rate, and manufacturing operations such as freezing, dehydration, heat processing, acidification, and refrigeration to ensure that mechanical breakdowns, time delays, temperature fluctuations, and other factors do not contribute to the decomposition or contamination of food.
    3. Food that can support the rapid growth of undesirable microorganisms, particularly those of public health significance, shall be held in a manner that prevents the food from becoming adulterated within the meaning of the act. Compliance with this requirement may be accomplished by any effective means (the temperature limits below, together with the pH and water-activity limits in items 14 and 15, are illustrated in a sketch after this list), including:
      • (i) Maintaining refrigerated foods at 45 °F (7.2 °C) or below as appropriate for the particular food involved.
      • (ii) Maintaining frozen foods in a frozen state.
      • (iii) Maintaining hot foods at 140 °F (60 °C) or above.
      • (iv) Heat treating acid or acidified foods to destroy mesophilic microorganisms when those foods are to be held in hermetically sealed containers at ambient temperatures.
    4. Measures such as sterilizing, irradiating, pasteurizing, freezing, refrigerating, controlling pH or controlling aw that are taken to destroy or prevent the growth of undesirable microorganisms, particularly those of public health significance, shall be adequate under the conditions of manufacture, handling, and distribution to prevent food from being adulterated within the meaning of the act.
    5. Work-in-process shall be handled in a manner that protects against contamination.
    6. Effective measures shall be taken to protect finished food from contamination by raw materials, other ingredients, or refuse. When raw materials, other ingredients, or refuse are unprotected, they shall not be handled simultaneously in a receiving, loading, or shipping area if that handling could result in contaminated food. Food transported by conveyor shall be protected against contamination as necessary.
    7. Equipment, containers, and utensils used to convey, hold, or store raw materials, work-in-process, rework, or food shall be constructed, handled, and maintained during manufacturing or storage in a manner that protects against contamination.
    8. Effective measures shall be taken to protect against the inclusion of metal or other extraneous material in food. Compliance with this requirement may be accomplished by using sieves, traps, magnets, electronic metal detectors, or other suitable effective means.
    9. Food, raw materials, and other ingredients that are adulterated within the meaning of the act shall be disposed of in a manner that protects against the contamination of other food. If the adulterated food is capable of being reconditioned, it shall be reconditioned using a method that has been proven to be effective or it shall be reexamined and found not to be adulterated within the meaning of the act before being incorporated into other food.
    10. Mechanical manufacturing steps such as washing, peeling, trimming, cutting, sorting and inspecting, mashing, dewatering, cooling, shredding, extruding, drying, whipping, defatting, and forming shall be performed so as to protect food against contamination. Compliance with this requirement may be accomplished by providing adequate physical protection of food from contaminants that may drip, drain, or be drawn into the food. Protection may be provided by adequate cleaning and sanitizing of all food-contact surfaces, and by using time and temperature controls at and between each manufacturing step.
    11. Heat blanching, when required in the preparation of food, should be effected by heating the food to the required temperature, holding it at this temperature for the required time, and then either rapidly cooling the food or passing it to subsequent manufacturing without delay. Thermophilic growth and contamination in blanchers should be minimized by the use of adequate operating temperatures and by periodic cleaning. Where the blanched food is washed prior to filling, water used shall be safe and of adequate sanitary quality.
    12. Batters, breading, sauces, gravies, dressings, and other similar preparations shall be treated or maintained in such a manner that they are protected against contamination. Compliance with this requirement may be accomplished by any effective means, including one or more of the following:
      • (i) Using ingredients free of contamination.
      • (ii) Employing adequate heat processes where applicable.
      • (iii) Using adequate time and temperature controls.
      • (iv) Providing adequate physical protection of components from contaminants that may drip, drain, or be drawn into them.
      • (v) Cooling to an adequate temperature during manufacturing.
      • (vi) Disposing of batters at appropriate intervals to protect against the growth of microorganisms.
    13. Filling, assembling, packaging, and other operations shall be performed in such a way that the food is protected against contamination. Compliance with this requirement may be accomplished by any effective means, including:
      • (i) Use of a quality control operation in which the critical control points are identified and controlled during manufacturing.
      • (ii) Adequate cleaning and sanitizing of all food-contact surfaces and food containers.
      • (iii) Using materials for food containers and food-packaging materials that are safe and suitable.
      • (iv) Providing physical protection from contamination, particularly airborne contamination.
      • (v) Using sanitary handling procedures.
    14. Food such as, but not limited to, dry mixes, nuts, intermediate moisture food, and dehydrated food, that relies on the control of aw for preventing the growth of undesirable microorganisms shall be processed to and maintained at a safe moisture level. Compliance with this requirement may be accomplished by any effective means, including employment of one or more of the following practices:
      • (i) Monitoring the aw (water activity) of food.
      • (ii) Controlling the soluble solids-water ratio in finished food.
      • (iii) Protecting finished food from moisture pickup, by use of a moisture barrier or by other means, so that the aw of the food does not increase to an unsafe level.
    15. Food such as, but not limited to, acid and acidified food, that relies principally on the control of pH for preventing the growth of undesirable microorganisms shall be monitored and maintained at a pH of 4.6 or below. Compliance with this requirement may be accomplished by any effective means, including employment of one or more of the following practices:
      • (i) Monitoring the pH of raw materials, food in process, and finished food.
      • (ii) Controlling the amount of acid or acidified food added to low-acid food.
    16. When ice is used in contact with food, it shall be made from water that is safe and of adequate sanitary quality, and shall be used only if it has been manufactured in accordance with current good manufacturing practice as outlined in this part.
    17. Food-manufacturing areas and equipment used for manufacturing human food should not be used to manufacture nonhuman food-grade animal feed or inedible products, unless there is no reasonable possibility for the contamination of the human food.

    21 CFR 110.80 [51 FR 22475, June 19, 1986, as amended at 65 FR 56479, Sept. 19, 2000]

