
Archive for April, 2020

A possible new link between OCD and the immune system – Big Think

Friday, April 24th, 2020

There has been a suspicion for some time that the immune system is somehow involved in the development of certain psychological disorders. Now a new study from Queen Mary University in London and led by Fulvio D'Acquisto has identified in mice a specific autoimmune protein that may trigger OCD's anxiety and stress in humans. "Our findings overturn a lot of the conventional thinking about mental health disorders being solely caused by the central nervous system," says D'Acquisto.

The study is published in Brain, Behavior, and Immunity.

"There is mounting evidence that the immune system plays an important role in mental disorders," easy D'Acquisto. "And in fact, people with auto-immune diseases are known to have higher than average rates of mental health disorders such as anxiety, depression and OCD."

These potential linkages can be difficult to definitively affirm. Depression and anxiety, for example, may just as easily be understandable reactions to the autoimmune conditions' onset and not mental disorders. Still, as the study notes:

At the heart of the new study's findings lies a protein the researchers call Immuno-moodulin, or Imood. An excess of this protein produced unusually anxious mice.

D'Acquisto and his colleagues stumbled across Imood by accident. Their intention was to investigate the role of another protein, Annexin-A1, in the development of multiple sclerosis and lupus. To that end, the researchers bred mice in which Annexin-A1 was being over-expressed in their immune systems' T-cells. Unexpectedly, these transgenic mice seemed more than typically anxious. Curious, the team analyzed the T-cells' genes and found one protein that was particularly active: Imood.

The researchers' hunch was confirmed when they administered an Imood antibody: the mice calmed down within a few days.

Obviously, such findings in mice wouldn't necessarily apply to human beings. D'Acquisto's team decided to look for Imood in 23 OCD outpatients from the OCD tertiary outpatient Clinic of the University Department of Psychiatry of Milan, Policlinico Hospital. There were also 20 "normal" patients tested as a control group.

The researchers found the Imood amounts in the OCD patients were roughly six times higher than in the control group.

According to a Queen Mary University press release, D'Acquisto's research joins that of other scientists who identified the same protein as being over-expressed in patients with Attention-Deficit/Hyperactivity Disorder.

The mechanism behind the connection between Imood and OCD isn't yet clear. D'Acquisto suspects it's less a matter of direct alteration of brain function, and more likely some influence exerted over brain cells already linked to mental disorders. He says, "This is work we still have to do to understand the role of Imood. We also want to do more work with larger samples of patients to see if we can replicate what we saw in the small number we looked at in our study."

Continue reading here:
A possible new link between OCD and the immune system - Big Think

Read More...

There’s an Emerging, Promising Link Between Exercise and Your Immune Health – POPSUGAR

Friday, April 24th, 2020

Immune health is on our minds these days, which means you've probably been seeing lots of supplements, fancy foods, and special drinks touted as effective or natural ways to boost your immunity. The truth is that scientists haven't found a direct link between lifestyle changes and increased immune capacity, at least not yet. But there are still some intriguing correlations to explore, especially when it comes to exercise.

"Your immune system can improve when you work out," said Aruna Subramanian, MD, an infectious-disease doctor and clinical medical professor at Stanford. It's true: though research is still emerging, there's a link between increased immune health and exercise. One comprehensive 2019 review concluded that moderate to vigorous exercise, done in sessions of less than an hour, has a positive impact on your immune system, that moderate exercise is associated with a decreased risk of illness, and that exercise overall has an anti-inflammatory effect on your body. While the exact connection and reason are still considered inconclusive, working out regularly is clearly helpful to your body's line of defense.

The same review did note that unusually intense exercise could have the opposite effect, making you more susceptible to illness. In other words, it's better to stick with a more moderate routine if you're trying to follow the best path for your immune system, particularly if you're not used to high-intensity exercise. If you're a true beginner, start within your abilities and ramp up slowly. (This will also help prevent injuries.)

Exercise is also known to have significant psychological benefits, said Dean Winslow, MD, an infectious-disease doctor at Stanford Health Care, which can itself be good for your immunity. That's because chronic stress suppresses your immune system by increasing inflammation and creating an emotional strain that keeps you from your usual healthy habits (think: adequate sleep and eating healthy). Exercise of all kinds is recommended to relieve intense stress, another win-win way it can help your immune system.

So while the science is still being sorted out, keeping up your normal fitness routine, or slowly easing into a new one, can certainly be a positive for your immune system and your overall health. Try this month-long at-home workout plan to get started.

See more here:
There's an Emerging, Promising Link Between Exercise and Your Immune Health - POPSUGAR

Read More...

Harnessing the Human Immune System: Why Antibody Treatments Might Just Work Against COVID-19 – Cornell University The Cornell Daily Sun

Friday, April 24th, 2020

In the race to develop a safe, effective treatment for COVID-19, biotech companies like Regeneron and Vir Biotechnology, led by Cornell alumni, have turned to antibodies, which are naturally created by the human immune system, as a form of therapeutic treatment.

But what are antibodies, and how can they be repurposed into drugs to help people recover from COVID-19?

Prof. Avery August, immunology, researches how the immune system responds to infection. He broke down each company's approach to antibody treatments and why they might work as a treatment for the coronavirus.

Antibodies are proteins made by specific cells in the immune system. "The immune system produces antibodies when it comes into contact with something foreign to the body and then tries to identify that molecule or pathogen for the future," August said. These antibodies are used to target and stop any infections.

"The antibodies are produced and then circulate in the blood, and act to protect you if you get exposed to that [foreign] thing again," August said. Antibodies are also what an individual's immune system produces in response to vaccines, which are dead or altered forms of a pathogen.

August noted that antibodies are especially useful because they can be collected from blood and used for antibody treatments.

Since antibodies are so large, they are difficult to chemically synthesize in a lab. So, antibodies produced for drug therapies are usually made by either bacteria or cell lines, both of which have been genetically engineered to produce antibodies, according to August. Often, these antibodies are combined into mixtures, usually referred to as antibody cocktails, that can be used as a treatment for viral infections.

"They're made in these huge vats where the cells are grown and secrete the antibodies," August said. "The [secreted] fluid is collected, and the antibodies are then purified from that fluid and packaged as drugs."

August explained that since mice have an immune system similar to humans, scientists can utilize this similarity by exposing mice to a specific target, allowing the mice to develop an immune response, and then collecting the mice's B cells.

B cells, or antibody-producing immune cells, can then be grown in a culture, and antibodies specific to the desired target can be purified from the cells and used as a therapeutic.

However, this method of antibody production has one caveat.

"The problem with using antibodies from mice is that a mouse antibody is slightly different from a human antibody," August said. "You can use [them] as a drug, but eventually, humans start to make an immune response against the mouse antibody. So it stops working."

To overcome this hurdle, Regeneron and a few other biotech companies have genetically engineered mice to have a human immune system, so mice can directly produce human antibodies in response to a target.

According to Regeneron's website, this is accomplished by replacing the genetic coding for a specific part of mouse antibodies with counterpart genes that are expressed in humans, allowing for the rapid and robust production of fully human antibodies.

Although this process of identifying, testing and producing antibodies as therapeutics can sometimes take years, Regeneron's approach of using mice with human immune systems significantly accelerates the process because any identified antibodies can then be quickly converted into drugs for humans, August added.

"In the past, when we didn't have these mice that have human immune systems, we had to generate the antibodies in mice, and then genetically engineer those antibodies so they look like human antibodies, which can take some time, and then make them into drugs," August explained.

In developing treatments for COVID-19, Regeneron is currently selecting antibody candidates to test a mixture of two antibody treatments for human clinical studies. The studies are slated to begin by early summer. Regeneron is also currently conducting clinical trials to evaluate the use of its drug Kevzara in the potential treatment of critical COVID-19 patients based on clinical data from China.

Vir Biotechnology has a different approach to developing antibody treatments. Instead of combining two antibodies in a mixture, Vir is selecting one antibody candidate and modifying it to allow the antibodies to last longer in the body and produce white blood cells, which can lead to long-term immunity. In the search for antibody candidates, Vir's goal is to find pan-coronavirus antibodies that could work in most future coronavirus outbreaks.

Regeneron and Vir also adopted the approach of isolating antibodies from the blood of humans who have already recovered from COVID-19.

According to August, there are two reasons for collecting blood from recovered coronavirus patients. One reason is that antibodies can be purified from the blood of those individuals and then be used to treat other humans.

However, using antibodies from recovered coronavirus patients has its limits. This treatment method is extremely expensive and it takes approximately two to four donors to treat one person, August said.

The second reason is that since the blood from recovered COVID-19 patients contains the B cells producing antibodies against the virus, those B cells can be identified and purified from the blood. Antibodies can then be genetically engineered in the lab to produce a drug people can use.

Although costly, antibody treatments are still a key player in the search for a drug that can treat COVID-19.

Small-molecule drugs are cheaper to make and easier to administer in the form of pills, but run the risk of losing their effectiveness if the virus mutates, August said. Antibodies are more likely to provide effective, longer-term protection since they specifically target the non-mutating regions of the virus, but need to be administered through injections.

While the course of this pandemic is uncertain, there is promise in antibody treatments. Mixtures of antibodies, usually referred to as antibody cocktails, have been instrumental in combating viral infections and have been the focus of Ebola treatment research.

Continued here:
Harnessing the Human Immune System: Why Antibody Treatments Might Just Work Against COVID-19 - Cornell University The Cornell Daily Sun

Read More...

Monitoring the immune system to fight COVID-19: CD4 status, lymphopenia, and infectivity – Science Magazine

Friday, April 24th, 2020

30 April 2020

12:00 p.m. ET

Andrea Cossarizza, M.D., Ph.D.

University of Modena and Reggio Emilia School of Medicine, Modena, Italy

Maurice O'Gorman, Ph.D., M.B.A., (D)ABMLI

Children's Hospital Los Angeles, Los Angeles, CA

Lishomwa (Lish) Ndhlovu, M.D., Ph.D.

Weill Cornell Medicine, New York, NY

Sean Sanders, Ph.D.

Science/AAAS, Washington, DC

The COVID-19 pandemic has struck the global population with unparalleled speed and ferocity. Researchers around the world are scrambling to learn about the biology, pathology, and genetics of SARS-CoV-2, the novel coronavirus responsible for COVID-19, while clinicians are seeking treatments, old and new, that might slow its infectivity and deadliness. In this webinar, we will explore what scientists are learning by using flow cytometry to study patients with COVID-19 in order to elucidate risk and disease severity. These experts are global leaders in cytometry and infectious disease, working on the frontlines of the COVID-19 outbreaks. They will provide concrete examples of how flow cytometry has been harnessed to provide key laboratory evidence that can be used in the fight against SARS-CoV-2 and COVID-19. Viewers will have the opportunity to put their questions to the expert panel during the live broadcast.

During the webinar, attendees will:

This webinar will last for approximately 60 minutes.

University of Modena and Reggio Emilia School of Medicine, Modena, Italy

Dr. Cossarizza completed his M.D. degree at the University of Padova in Italy before receiving a Ph.D. in oncology from the University of Modena and Reggio Emilia (UNIMORE) and the University of Bologna, also in Italy. After specializing in clinical pathology at UNIMORE, he obtained an associate professorship there. In 2005, he was appointed a professor in the international Ph.D. program at the University of Valencia in Spain, where he later became a research professor. In 2010, he became a full professor in pathology and immunology in the Faculty of Medicine at UNIMORE. He is a member of several editorial boards of international journals, and in 2016 was elected president of the International Society for Advancement of Cytometry. His primary research focus is identifying the molecular and cellular basis for the involvement of the immune system in diseases and infections, including HIV/AIDS and sepsis, as well as its role in pathophysiological conditions related to aging and neurodegeneration. Dr. Cossarizza has notable experience in the development and use of new flow cytometry approaches in immunological research.

Children's Hospital Los Angeles, Los Angeles, CA

Dr. O'Gorman earned his Master's and Ph.D. at the University of British Columbia before completing a postdoctoral fellowship at the University of North Carolina at Chapel Hill. He then joined the faculty at the Feinberg School of Medicine at Northwestern University, during which time he earned his MBA from Northwestern and served as vice chair of Pathology and Laboratory Medicine and director of Diagnostic Immunology and Flow Cytometry at Children's Memorial Hospital in Chicago. He is currently chief of laboratory medicine, as well as director of the Clinical Lab and the Diagnostic Immunology and Flow Cytometry Laboratory at Children's Hospital Los Angeles, and a professor of pathology and pediatrics at the Keck School of Medicine of the University of Southern California. Dr. O'Gorman's research interests include immunopathogenesis of immune system-related disorders, investigation of immune mechanisms of immune suppression withdrawal in liver transplant patients, and the development of novel immune-related diagnostic laboratory tests. Additionally, he provides ad hoc reviews for multiple journals, including Cytometry, Journal of Leukocyte Biology, Journal of Immunological Methods, Clinical and Diagnostic Laboratory Immunology, and Archives of Pathology & Laboratory Medicine.

Weill Cornell Medicine, New York, NY

Dr. Ndhlovu is a professor of immunology at Weill Cornell Medicine in New York and principal investigator of the HIV and Emerging Pathogens Immunopathogenesis Laboratory in the Division of Infectious Diseases, also at Weill Cornell. A translational immunologist, he leads a research team dedicated to confronting the challenges of HIV and aging, with an emphasis on limiting disease complications and developing curative strategies. His program is now bringing the same urgency and focus to the COVID-19 pandemic, using both single-cell and epigenetic approaches to resolve molecular mechanisms regulating viral entry of SARS-CoV-2 infection across different tissues and cell types. His work seeks to identify therapeutic host targets and future therapies that reduce morbidity and mortality, and relieve the burden of this disease on society. Dr. Ndhlovu completed his undergraduate degree at the University of Zambia, his medical training at the University of Zambia Medical School, and his doctorate at Tohoku University School of Medicine in Japan.

Science/AAAS, Washington, DC

Dr. Sanders did his undergraduate training at the University of Cape Town, South Africa, and his Ph.D. at the University of Cambridge, UK, supported by the Wellcome Trust. Following postdoctoral training at the National Institutes of Health and Georgetown University, Dr. Sanders joined TranXenoGen, a startup biotechnology company in Massachusetts working on avian transgenics. Pursuing his parallel passion for writing and editing, Dr. Sanders joined BioTechniques as an editor, before joining Science/AAAS in 2006. Currently, Dr. Sanders is the Director and Senior Editor for Custom Publishing for the journal Science and Program Director for Outreach.

Read the rest here:
Monitoring the immune system to fight COVID-19: CD4 status, lymphopenia, and infectivity - Science Magazine

Read More...

Coronavirus: Does ‘boosting’ your immune system really help fight off COVID-19? – Newshub

Friday, April 24th, 2020

Does strong immunity help you fight off COVID-19?

COVID-19 is caused by the SARS-CoV-2 virus. Just like any foreign bug, the body will defend itself against the invader. Strong immunity is built on a healthy gut microbiome and an army of white blood cells. If someone is consuming a healthy diet based on an array of fruits, vegetables and wholefoods (foods in their whole and unprocessed form - e.g. a potato instead of fries), the immune system should be better-equipped to fight off the virus - or any illness, according to the Heart Foundation.

Hence, maintaining healthy immune function cannot be achieved by scoffing chips, biscuits and packaged dinners every day, "balanced" by a probiotic, Berocca and lemon water. It's built through a healthy lifestyle.

Vitamin C is widely touted for its immunity benefits, but if you're consuming enough fruit and veg, a supplement is unnecessary. Scientists in China are currently looking into whether ultra-high doses of vitamin C can help COVID-19 patients fight infection, but results will not be available until later this year.

In the meantime, the daily recommended intake can be achieved through citrus fruits, capsicum and greens such as broccoli and spinach. Unlike a pure vitamin C supplement, these foods also contain other vitamins and minerals that play an important part in keeping your immune system strong.

There are also three tried and trusted methods to supporting your immune system - reducing stress, getting enough sleep and exercising regularly.

Kombucha tastes great, it's trendy and there are a number of options on the market that are relatively inexpensive. However, it's not a magic tonic - and drinking it by the litre is not going to ward off COVID-19.

Like probiotics, kombucha contains live microorganisms. However, no studies have ever confirmed whether the drink has a high enough concentration to be considered a probiotic, and there is currently no evidence that kombucha can treat or prevent any illnesses.

To strengthen one's gut health and immunity, a far more pragmatic bet is opting for probiotic foods such as plain, unsweetened yoghurt, which is full of live cultures, and fermented products such as kefir and sauerkraut.

"There is no evidence to suggest that supplements labelled as immune-boosting such as green tea, zinc, elderberry or echinacea will provide any protection against COVID-19. Its more important to have a healthy lifestyle overall," Hursthouse wrote.

However, a vitamin D supplement can prove useful, particularly in parts of the world where sunshine is limited. Several studies have linked low vitamin D levels to a higher risk of respiratory infections. Vitamin D deficiencies are fairly common, and can be discovered through a blood test.

But again, if there isn't a deficiency, a supplement is not entirely necessary. As BBC Future reported in 2016, vitamin supplements typically don't provide any benefits in already healthy people.

And of course, prevention is always a good place to start. To minimise your chances of contracting COVID-19, follow the Ministry of Health's guidelines:

See the original post here:
Coronavirus: Does 'boosting' your immune system really help fight off COVID-19? - Newshub

Read More...

Coronavirus This Week: What is antibody testing and how it may impact the COVID-19 pandemic – Foster’s Daily Democrat

Friday, April 24th, 2020

A great deal of discussion is taking place regarding antibody testing for the COVID-19 coronavirus. A basic discussion of immunology is helpful to understand the value of such testing.

What are antibodies? How do we develop antibodies? How do antibodies work? What is an antibody test? What are the clinical and public health benefits of antibody testing?

Antibodies are specialized humoral proteins made by the immune system. They help the body fight against infections and disease by "recognizing" viruses, bacteria and infected cells. Each antibody binds to a specific antigen associated with a danger signal in the body. This antigen is also known as the antibody's target. When a foreign protein like a virus enters the body, the immune system responds to this antigen. Specific proteins are developed to fight off and hopefully neutralize or kill the invader. This is technically called humoral immunity.

There are three ways to obtain immunity to various disease agents. Each of these will play a role as an antibody test for the COVID-19 coronavirus becomes widely used.

First is natural active immunity. This immunity is derived by naturally becoming infected by a pathogen, such as a virus or a bacteria. When infection occurs, humoral immunity kicks in, developing antibodies specific to this disease. When the body is exposed to this same pathogen (antigen) in the future, the immune system releases those specific antibodies to attack the pathogen before infection can take place. An example is a person developing a case of measles from exposure to the virus in the natural environment. This type of immunity is usually long-lasting. Unfortunately, not every infection with a pathogen results in developing antibodies that are protective or "neutralizing." For example, infection with the bacteria that causes Lyme Disease does not result in the development of protective antibodies, thus we can become reinfected with this bacteria upon repeated exposure.

Second is artificial active immunity. This type of immunity is developed by artificially exposing a person to a pathogen and causing the immune system to actively develop antibodies against this specific pathogen. This is accomplished through the process of immunization or vaccination. A vaccine, which is either a killed or very weakened version of the pathogen, is introduced into a person via injection, orally, as a nasal mist, etc. When the vaccine (antigen) enters the body, the immune system actively begins to develop protective neutralizing antibodies against this specific pathogen (antigen). Normally, immunity begins about two weeks after immunization. With immunization, it may take multiple doses of the vaccine over time to develop a sufficient level of antibodies to confer full immunity. This type of immunity may not be long-lasting, requiring booster immunizations in the future. An example is tetanus, where after the initial series of immunizations to develop adequate immunity, this immunity may wane over the years, requiring a booster "shot" every 10 years.

Third, and the least used method of providing immunity, is artificial passive immunity. This type of immunity is obtained when serum collected from individuals who were naturally infected by and recovered from a specific disease, and which contains antibodies against that disease, is administered to a non-immune person. The individual receiving this serum does not actively produce their own antibodies, but passively accepts and uses the artificially introduced antibodies from the donor's serum to attack the pathogen and prevent infection. An example is the administration of immune globulin to susceptible individuals exposed to Hepatitis A. If given soon after exposure, this immune globulin containing antibodies against Hepatitis A will prevent infection and disease. This type of immunity is short-lived, usually providing only a few months of protection.

Each of these three methods of gaining immunity is being, and will be, utilized in various ways to combat the COVID-19 pandemic once a reliable antibody test is widely available to the medical and public health community. The FDA has recently licensed a number of these tests. The test must reliably detect antibodies to the COVID-19 coronavirus and confirm that the level of antibodies in the person tested is sufficient to provide immunity. It usually takes about four weeks after infection to develop detectable antibodies to the COVID-19 coronavirus. Since this virus is so new, it is not yet fully understood how long such immunity will remain active, and if it will protect against the virus as it changes. If this virus behaves like similar coronaviruses, it is expected that immunity will develop for some period of time and that it will also be conferred against a changing virus.

The antibody test is already being used to identify individuals who have developed immunity to COVID-19. This information will allow individuals, especially those in critical occupations such as health care, first responders and public safety, to return to work more safely and quickly after infection and recovery. This will be expanded to other occupations as the testing becomes more widely used, to allow greater re-opening of the economy.

The test is also being used to identify individuals with antibodies to the COVID-19 coronavirus who would donate serum to be used to develop an immune globulin as described above. This serum globulin is being administered to critically ill COVID-19 patients to reduce symptoms and prevent death. Once available on a larger scale, more people with COVID-19 coronavirus antibodies will be recruited to donate serum to make larger quantities of this immune globulin that could be administered to susceptible individuals exposed to the virus, such as health care workers treating COVID-19 patients.

This antibody test will be used for public health surveillance and to determine how widely the COVID-19 epidemic has spread throughout the nation. Studies called seroprevalence studies will be conducted by testing large numbers of blood samples taken from people across the nation to better understand how much immunity to this virus exists. Blood will be tested from samples taken at blood donation centers, other clinical settings, and by recruiting a sufficiently large representative sample of the nation to volunteer. This information will determine which parts of the country have lower rates of immunity and may be more at risk if a second wave of disease strikes. This would provide information that will be used to prepare these at-risk areas before a second wave becomes a reality. This will allow better targeting and stockpiling of needed supplies and equipment in the areas where it may be most needed.
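As a rough illustration of the arithmetic behind such seroprevalence estimates (the column gives no survey figures, so every number below is hypothetical), the raw share of positive antibody results in a sample can be adjusted for the test's error rates to approximate the true share of the population previously infected:

# Hypothetical sketch of the Rogan-Gladen correction used to turn a raw
# seroprevalence survey result into an estimate of true prevalence.
# All inputs are invented for illustration; none come from the column above.

def adjusted_prevalence(raw_positive_rate, sensitivity, specificity):
    """Estimate true prevalence from the observed positive rate of an imperfect test."""
    estimate = (raw_positive_rate + specificity - 1) / (sensitivity + specificity - 1)
    return max(0.0, min(1.0, estimate))  # clamp to a valid proportion

# Example: 6% of sampled donors test positive on a 90%-sensitive, 98%-specific assay.
print(f"{adjusted_prevalence(0.06, sensitivity=0.90, specificity=0.98):.1%}")  # ~4.5%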

Lastly, the antibody test will be used to evaluate the effectiveness of any vaccine developed for the COVID-19 coronavirus. As trial vaccines move through the safety and effectiveness stages for final approval, humans will be administered the trial vaccine on a voluntary basis. The antibody test will be used to determine whether these trial vaccines elicit sufficient protective antibodies to safely prevent infection. The test will also be used to determine how many doses of the vaccine will be required, at what intervals doses should be administered, how long immunity lasts, and if booster "shots" will be required and, if so, how long after initial immunization.

Unfortunately, all these steps will be taking place for at least the next year or two. We must expand diagnostic testing for the virus, conduct contact tracing to identify sources of community transmission, isolate cases and quarantine contacts. Face coverings must be required for everyone working in or entering public places of business, and good hand hygiene must be maintained. In the meantime we will need to live with reasonable but necessary limits on our daily lives. The better we manage the crisis now, the better off we will be down the road.

Rich DiPentima of Portsmouth spent more than 30 years as a public health official and epidemiologist, including service as deputy public health director in Manchester and chief of communicable disease epidemiology at the New Hampshire Division of Public Health. His column on coronavirus will appear weekly in Seacoast Health during the coronavirus public health crisis.

View original post here:
Coronavirus This Week: What is antibody testing and how it may impact the COVID-19 pandemic - Foster's Daily Democrat

Read More...

Everything we know about coronavirus immunity, and plenty we still don’t – STAT

Friday, April 24th, 2020

People who think they've been exposed to the novel coronavirus are clamoring for antibody tests, blood screens that can detect who has previously been infected and, the hope is, signal who is protected from another case of Covid-19.

But as the tests roll out, some experts are trying to inject a bit of restraint into the excitement that the results of these tests could, for example, clear people to get back to work. Some antibody tests have not been validated, they warn. Even those that have been can still provide false results. And an accurate positive test may be hard to interpret: the virus is so new that researchers cannot say for sure what sort of results will signal immunity or how long that armor will last.

They caution that policymakers may be making sweeping economic and social decisions (plans to reopen businesses or schools, for example) based on limited data, assumptions, and what's known about other viruses. President Trump last week unveiled a three-phased approach to reopen the country; he said some states that have seen declining case counts could start easing social distancing requirements immediately. And some authorities have raised the idea of granting "immunity passports" to people who recover from the virus to allow them to return to daily life without restrictions.

"Before we embark on huge policy decisions, like issuing immunity certificates to get people back to work, I think it's good that people are saying, 'Hold up, we don't know that much about immunity to this virus,'" said Angela Rasmussen, a Columbia University virologist.

To be clear, most experts do think an initial infection from the coronavirus, called SARS-CoV-2, will grant people immunity to the virus for some amount of time. That is generally the case with acute infections from other viruses, including other coronaviruses.

"With data limited, sometimes you have to act on a historical basis," Anthony Fauci, the head of the National Institute of Allergy and Infectious Diseases, said in a webcast with JAMA this month. "It's a reasonable assumption that this virus is not changing very much. If we get infected now and it comes back next February or March we think this person is going to be protected."

Still, the World Health Organization has stressed that the presumed immunity can only be proven as scientists study those who have recovered for longer periods. The agency is working on guidance for interpreting the results of antibody tests, also called serologic tests.

"Right now, we have no evidence that the use of a serologic test can show that an individual is immune or is protected from reinfection," the WHO's Maria Van Kerkhove said at a briefing last week.

Below, STAT looks at the looming questions about antibodies and immunity that scientists are racing to answer.

What are antibody tests? How widely available are they? And how accurate?

The tests look for antibodies in the blood. Because antibodies are unique to a particular pathogen, their presence is proof the person was infected by the coronavirus and mounted an immune response. The hope is that the presence of the antibodies is an indication that the person is protected from another infection.

These are different from the tests used to diagnose active infections, which look for pieces of the virus genome.

Commercial antibody tests are starting to appear on the market, but so far, the Food and Drug Administration has only cleared a few through Emergency Use Authorizations. And already, health regulators are warning that the ones on the market may vary in their accuracy.

"I am concerned that some of the antibody tests that are on the market that haven't gone through FDA scientific review may not be as accurate as we'd like them to be," FDA Commissioner Stephen Hahn said on "Meet the Press" earlier this month. He added that no test is 100% accurate, "but what we don't want are wildly inaccurate tests."

Even the best tests will generate some false positives (identifying antibodies that don't actually exist) and some false negatives (missing antibodies that really are there). Countries including the U.K. have run into accuracy issues with antibody tests, slowing down their efforts for widespread surveys.

The fear in this case with imprecise tests is that false positives could errantly lead people to think they're protected from the virus when they have yet to have an initial infection.
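To see why that fear is well founded, here is a minimal worked example (not from the STAT article): when few people have actually been infected, even a fairly accurate antibody test yields a large share of false alarms among those who test positive. The sensitivity, specificity, and prevalence values below are assumptions chosen only for illustration.

# Illustrative sketch of how false positives can mislead when true prevalence
# is low. Sensitivity, specificity, and prevalence are hypothetical numbers.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that a positive antibody test reflects a real past infection."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A 95%-sensitive, 95%-specific test in a population where 5% were infected:
# roughly half of all positive results are false alarms.
print(f"{positive_predictive_value(0.95, 0.95, prevalence=0.05):.0%}")  # ~50%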

"Serology testing isn't a panacea," said Scott Becker, the CEO of the Association of Public Health Laboratories. "When it's used, we need to ensure there are good quality tests used."

One specific concern with antibody tests for SARS-CoV-2: they might pick up antibodies to other types of coronaviruses.

Globally, there have only been a few thousand people exposed to the other coronaviruses that have caused outbreak emergencies, SARS and MERS. But there are four other coronaviruses that circulate in people and cause roughly a quarter of all common colds. It's thought that just about everyone has antibodies to some combination of those coronaviruses, so serological tests for SARS-CoV-2 would need to be able to differentiate among them.

What can be gleaned from serological results?

Detecting antibodies is the first step. Interpreting what they mean is harder.

Typically, a virus that causes an acute infection will prompt the body's immune system to start churning out specific antibodies. Even after the virus is cleared, these neutralizing antibodies float around, ready to rally a response should that virus try to infect again. The virus might infect a few cells, but it can't really gain a toehold before the immune system banishes it. (This is not the case for viruses that cause chronic infections, like HIV and, in many cases, hepatitis C.)

"The infection is basically stopped in its tracks before it can go anywhere," said Stephen Goldstein, a University of Utah virologist. But, Goldstein added, the durability of that protection varies depending on the virus.

Scientists who have looked at antibodies to other coronaviruses, both the common-cold-causing foursome and SARS and MERS, found they persisted for at least a few years, indicating people were protected from reinfection for at least that long. From then, protection might start to wane, not drop off completely.

The experience with other viruses, including the other coronaviruses, has encouraged what Harvard epidemiologist Marc Lipsitch summed up as the "educated guess" in a recent column in the New York Times: After being infected with SARS-CoV-2, most individuals will have an immune response, some better than others. That response, it may be assumed, will offer some protection over the medium term, at least a year, and then its effectiveness might decline.

But many serological tests aren't like pregnancy tests, with a yes or no result. They will reveal the levels (or titer) of antibodies in a person's blood. And that's where things can get a bit trickier. At this point, scientists can't say for sure what level of antibodies might be required for a person to be protected from a second Covid-19 case. They also can't say how long people are safeguarded, though it's thought that a higher initial titer will take longer to wane than low levels.

"Further investigation is needed to understand the duration of protective immunity for SARS-CoV-2," a committee from the National Academies of Sciences, Engineering, and Medicine wrote in a report this month.

It's not just whether someone is immune themselves. The next assumption is that people who have antibodies cannot spread the virus to others. Again, that hasn't been shown yet.

"We don't have nearly the immunological or biological data at this point to say that if someone has a strong enough immune response that they are protected from symptoms, that they cannot be transmitters," said Michael Mina, an epidemiologist at Harvard's T.H. Chan School of Public Health.

The challenge, as the National Academies report highlighted, is that no one knew about this virus until a few months ago. That means they haven't been able to study what happens to people who recover from Covid-19, and if and how long they are protected, for more than a short period of time.

"One key uncertainty arises from the fact that we are early in this outbreak and survivors from the first weeks of infection in China are, at most, only three months since recovery," the report said.

What else can antibody tests show?

In addition to identifying those who have been infected, antibody tests can also suggest at a broader level how widely the virus has spread. These data have implications for how severe future outbreaks of cases might be and what kind of restrictions communities might need to live under. If more people have been infected than known, a strong likelihood given the number of mild infections that might have been missed and testing limitations in countries including the United States, then more people are thought to be protected going forward.

In the United States, the Centers for Disease Control and Prevention and the National Institutes of Health have both launched serosurveys to assess how many people might have contracted the virus. Even employees of Major League Baseball teams have been enlisted in a study enrolling thousands of patients.

What have data from serosurveys shown thus far about antibody generation?

A number of countries have launched large serosurveys, so hopefully we'll have a better sense soon of the levels of antibodies being generated by individuals who recover from Covid-19 and among the general population. For now, though, there have only been limited data released from a couple of small studies.

Scientists in Europe have pointed to strong antibody production in patients within a few weeks of infection. One study found that people were generally quick to form antibodies, which could help explain why the majority of people do not develop severe cases of Covid-19.

But one preprint released this month complicated the landscape. (Preprints have not been peer-reviewed or published yet in a research journal.) Researchers in Shanghai reported that of 175 patients with confirmed Covid-19, about a third had low antibody levels and some had no detectable antibodies. The findings suggest that the strength of the antibody response could correlate to the severity of infection, though that's not known for sure. They also raised concerns that those with a weaker antibody response might not be immune from reinfection.

But outside researchers have said that conclusions about immunity can't be drawn from what the study found. For one, there are different kinds of antibodies, so some might exist that the test wasn't looking for. Secondly, studies in other coronaviruses have shown that antibody responses vary from person to person, without clear implications for how protected someone is from another infection.

And, researchers say, antibodies are not the only trick the body has to protect itself. Immune cells also form memories after an initial infection and can be rallied quickly should that same pathogen try to strike again, even without antibodies or after antibody levels fade.

"People that lose that serum neutralization, it doesn't mean necessarily that they're not going to have some level of immunity," said virologist Vineet Menachery of the University of Texas Medical Branch. "Your immune system hasn't forgotten. It may just take them a couple of days to generate that immune response and be able to clear a virus."

He added that if and when protection starts to wane and people contract the coronavirus a second time, it's likely to cause an even milder illness.

I've heard reports of reinfection or reactivated virus. What's going on there?

Health officials in some countries have said they've seen examples of people recovering from Covid-19 only to test positive for the virus again, what they've taken to calling "reactivation," to differentiate it from a second infection.

But experts are skeptical that either is occurring.

While no possibility can be eliminated at this early stage of the outbreak, they say that there are more likely explanations for a positive diagnostic test coming after a negative test.

For one: The tests used to diagnose Covid-19 look for snippets of the virus genome, its RNA. But what they can't tell you is if what they're finding is evidence of live virus, meaning infectious virus. Once a person fights off a virus, viral particles tend to linger for some time. These cannot cause infections, but they can trigger a positive test. The levels of these particles can fluctuate, which explains how a test could come back positive after a negative test. But it does not mean the virus has become active, or infectious, again.

And two: the diagnostic tests typically rely on patient samples pulled from way back in their nasal passages. Collecting that specimen is not foolproof. Testing a sample that was improperly collected could lead to a negative test even if the person has the virus. If that patient then gets another test, it might accurately show they have the virus.

As Jana Broadhurst, the director of the Nebraska Biocontainment Unit's clinical laboratory, said, "garbage in, garbage out."

Sharon Begley contributed reporting.

See more here:
Everything we know about coronavirus immunity, and plenty we still don't - STAT

Read More...

CHI, the Cleveland Health Institute – Cleveland Jewish News

Friday, April 24th, 2020

CHI, the Cleveland Health Institute, is dedicated to delivering health care with compassion, humility and integrity. We offer a full spectrum of services ranging from triaging acute care to diagnosing and managing chronic illness to practicing precise personalized medicine. We do this by emphasizing gene compatible lifestyles through predictive testing, preventive measures, personalized programs and patient education, enabling true partnership and full participation in creating health.

We have adopted the following in order to continue to care for our staff and CHI patients:

Telemedicine for scheduled consultations and follow up appointments

Secure email or fax for all administrative forms, patient histories and lab requisitions

Mail test kits with pre-paid labels to ship directly to labs

COVID-19 prevention recommendations and kit

Ship directly to our patients or provide curb side pickup for pharmaceutical grade targeted nutritional support for the immune system or to balance your neuroendocrine system to help with fear, anxiety, fatigue or depression

Emergency patient phone service 24/7

Upgrading CLE-CHI.com to feature an online store, a contact page where you can ask your questions, and a library of current, relevant featured articles, such as "COVID-19 and Creating Maximum Immunity: A Vaccine will not Cure the Problem"

Dr. Tonya S. Heyman, Medical Director

Here is the original post:
CHI, the Cleveland Health Institute - Cleveland Jewish News

Read More...

Personalized Medicine Market Is Projected To Expand At A Robust CAGR Of +11% By 2026 Analysis by Industry Outlook, Estimated Size, Valuable Share,…

Friday, April 24th, 2020

Personalized medicine companies seek to combat the scourge of cancer through personalized care, driving interest in patient-specific treatments that require testing.

The global personalized medicine (PM) market size was estimated at USD 1.57 trillion in 2018 and is anticipated to expand at a CAGR of +11% during the forecast period.

Personalized medicine promises a paradigm shift in diagnosis and care delivery, as treatment is based on data leveraged from a holistic view of an individual patient. The proliferation of sequencing methodologies, especially Next-Generation Sequencing (NGS), driven by the declining cost of sequencing and the groundwork laid by the Human Genome Project in the genomics field, is expected to drive the market.
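For readers who want the arithmetic behind those headline figures, the short sketch below simply compounds the stated 2018 base at the stated ~11% CAGR. The year-by-year values are illustrative projections; the release itself quotes no 2026 figure.

# Minimal sketch of compound annual growth rate (CAGR) arithmetic using the
# figures quoted above (USD 1.57 trillion in 2018, ~11% CAGR). The projected
# values are illustrative only; the release does not state them.

def project(base_value, cagr, years):
    """Compound a base value forward at a constant annual growth rate."""
    return base_value * (1 + cagr) ** years

base_2018 = 1.57  # trillion USD
for year in (2020, 2023, 2026):
    print(year, round(project(base_2018, cagr=0.11, years=year - 2018), 2), "trillion USD")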

Get Sample Copy of this Report Includes @: https://www.theresearchinsights.com/request_sample.php?id=45572

Key players profiled in the report include GE Healthcare; Illumina, Inc.; Asuragen, Inc.; Abbott Laboratories; Dako A/S; Exact Science Corporation; Danaher Corporation (Cepheid, Inc.); Decode Genetics, Inc.; Genelex Corporation; Exagen Diagnostics, Inc.; Precision Biologics, Inc.; QIAGEN; Celera Diagnostics LLC; and Biogen, Inc.

The main goal of this information is to give a descriptive analysis of how current trends could affect the Personalized Medicine market over the forecast period. The market's established and emerging manufacturers are studied in detail, and their revenue, production, price, and market share are presented with precise information.

In the geographic segmentation, the regions of North America, Middle East & Africa, Asia Pacific, Europe and Latin America are given major importance. The key driving forces of the Personalized Medicine market in each region are presented alongside its restraints and opportunities, and the report discusses how restraints may be countered and turned into opportunities over the 2020-2026 forecast period.

Get Discount on up to 40%: https://www.theresearchinsights.com/ask_for_discount.php?id=45572

The Personalized Medicine market is also presented as a holistic snapshot of the competitive landscape for the forecast period, along with a comparative analysis of regional players and segments that helps readers better understand the areas and resources involved.

This report provides:

1) An overview of the global market for Personalized Medicine Market and related technologies.

2) Analyses of global market trends, with data from 2017, estimates for 2018 and 2019, and projections of compound annual growth rates (CAGRs) through 2026.

3) Identifications of new market opportunities and targeted promotional plans for Global Personalized Medicine Market.

4) Discussion of research and development, and the demand for new products and new applications.

5) Comprehensive company profiles of major players in the industry.

Table of Content

1 Introduction

2 Market Research Tactics

3 Market Summary

4 Quality Market Insights

5 Personalized Medicine Market Overview

6 Regulatory Market Synopsis

7 Personalized Medicine Market, By Application Analysis:

8 Personalized Medicine Market, By product Analysis:

9 Personalized Medicine Market, By End User Analysis:

10 Personalized Medicine Market, By Geographic Region

11 Competitive Landscape

12 Company Profiles

You Can Browse Full Report: https://www.theresearchinsights.com/enquiry_before_buying.php?id=45572

About us: The Research Insights is a global leader in analytics, research and advisory that can assist you to renovate your business and modify your approach. With us, you will learn to take decisions intrepidly. We make sense of drawbacks, opportunities, circumstances, estimations and information using our experienced skills and verified methodologies. Our research reports will give you an exceptional experience of innovative solutions and outcomes. We have effectively steered businesses all over the world with our market research reports and are outstandingly positioned to lead digital transformations. Thus, we craft greater value for clients by presenting advanced opportunities in the global market.

Contact us: Robin, Sales Manager, +91-996-067-0000, [emailprotected]

Continue reading here:
Personalized Medicine Market Is Projected To Expand At A Robust CAGR Of +11% By 2026 Analysis by Industry Outlook, Estimated Size, Valuable Share,...

Read More...

Coronavirus: New plan would test 30 million per week and cost up to $100 billion, but ‘we’ve got to do it’ – CNBC

Friday, April 24th, 2020

An ambitious new plan to radically increase the number of coronavirus tests in the United States would see up to 30 million people screened each week and cost up to $100 billion to implement, a private foundation said Tuesday.

But that pricey effort for what one expert called "the largest public health testing in history" is necessary to stem the $300 billion to $400 billion in American economic losses each month as a result of the Covid-19 pandemic, the Rockefeller Foundation said.

It said the sooner coronavirus tests become much more widely available, the quicker the U.S. economy can start getting back to normal.

"We do have the capacity to do that, and we've got the resources to do that," said Dr. Michael Pellini, managing partner of health venture firm Section 32 and board member of the Personalized Medicine Coalition, who contributed to the foundation's new plan.

"Yes, it's ambitious, but at this point we've got to do it," Pellini said. "We have to fix testing in this country to enable our workforce to be deployed once again."

The plan comes amid calls by numerous experts and by CEOs to boost coronavirus testing to make sure businesses and social events can reopen safely without sparking second and third waves of virus outbreaks.

Amazon CEO Jeff Bezos, in a note to shareholders last week, wrote: "Regular testing on a global scale, across all industries, would both help keep people safe and help get the economy back up and running."

Rockefeller Foundation President Dr. Rajiv Shah said: "We envision an America where everyone who needs a test can get one."

"The Rockefeller Foundation believes that testing access is critical to scaling up our lives and economy," Shah said during a conference call with reporters Monday, when the total number of confirmed coronavirus cases reached more than 766,600, and the number of related fatalities approached 41,000.

He called the up-to-$100 billion cost of the effort "a modest investment," given the amount of monthly economic losses to the nation, as well as the societal costs related to the outbreak, which could end up increasing rates of suicides, alcoholism and domestic violence.

While the goal of the testing plan is to build a state-led national program, the foundation said funding for it likely can come from federal funds through agencies or grants.

The foundation itself is investing $15 million to help kick off the effort, which includes supporting cities that are among the first to adopt the plan's recommendations.

The Rockefeller Foundation, which is a major philanthropic donor in areas including health and science, told CNBC last week that it had been in contact with the Trump administration, national groups of governors and mayors, and leading American corporations as it prepared the recommendations.

The foundation's plan lays out a strategy for tripling, within the next eight weeks, the existing 1 million coronavirus tests per week now being done, by maximizing efficiencies in testing capacity.

After that, the foundation calls for multiplying those 3 million tests per week by at least 10 times to get to at least 30 million tests each week within the next six months.

Reaching that level will entail, among other things, removing regulatory barriers to approval for new point-of-care and home-test kits, and ensuring payments for labs performing the tests.

The Rockefeller plan says that more testing must be done to accurately capture the level of Covid-19 infections in the U.S.

"In Taiwan, there have been 132 tests conducted for every confirmed case. In Australia, the number is 62. In the United States, it is five," an executive summary of the plan notes.

"The unfortunate conclusion from this comparison is that the country's actual number of infections may be 15- to 20-times higher than the reported number of confirmed cases," the summary says.

"In short, the United States needs to increase the current level of 1 million tests per week by at least 10-fold and preferably 20-fold to adequately monitor the pandemic."

The plan notes that "given the commercial uncertainties inherent in this 10-fold increase in production" it is likely the federal government would need to activate the Defense Production Act to compel production of tests.

The plan calls for the creation of an Emergency Network for Covid-19 Testing to coordinate and underwrite the testing market with the use of leverage from public-private credit guarantees and other tools.

The second part of the plan envisions a paid Covid Community Healthcare Corps of 100,000 to 300,000 people to perform the high number of tests and conduct "contact tracing," or reaching out to individuals who have been around infected people and testing them as well.

The third part is a common data and digital platform to support the first two objectives by sharing "real-time analyses of resource allocations, disease tracing results and patient medical records."

New York University professor Paul Romer, who shared the 2018 Nobel Prize in economics, said the need for a huge increase in the number of coronavirus tests is driven not only by the need to stanch current economic losses but to prevent permanent damage to U.S. economic output when the country exits recession.

"Our future capacity to produce" is lower because of the economic cost of the pandemic "and it deteriorates with each month of delay," said Romer, who contributed to the Rockefeller Foundation's plan.

But Romer said the solution to the testing problem lies in the current system, which needs to be reorganized and incentivized financially to produce enough tests.

"We're really not constrained on the supply side, but we are constrained by what we're willing to pay," Romer said.

"We just need to pay people and let them have to option to provide these tests."

Romer made the analogy of the government saying that there needed to be 300 million soft drinks made one day.

No one would step up to make the drinks for free, he suggested, but they would be made if the government agreed to pay for them.

See the rest here:
Coronavirus: New plan would test 30 million per week and cost up to $100 billion, but 'we've got to do it' - CNBC

Read More...

Insights on the Cell Expansion Industry in North America to 2027 – by Product, Cell Type, Application, End-user and Country – GlobeNewswire

Friday, April 24th, 2020

Dublin, April 24, 2020 (GLOBE NEWSWIRE) -- The "North America Cell Expansion Market to 2027 - Regional Analysis and Forecasts by Product; Cell Type; Application; End User, and Country" report has been added to ResearchAndMarkets.com's offering.

The cell expansion market in North America is anticipated to reach USD 14,697.41 million by 2027 from USD 4,522.07 million in 2019; it is projected to grow at a CAGR of 15.9% during 2020-2027. The growth of the market is attributed to the increasing prevalence of cancer, rising number of new product launches, and increasing inclination of patients toward regenerative and personalized medicines. Also, growing R&D expenditure on cancer research is likely to have a positive impact on the growth of the market in the coming years. In addition, technological advancements in the pharmaceuticals industry and extensive developments in drug discovery are likely to stimulate the growth of the cell expansion market in North America during the forecast period.
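
As a quick sanity check on the headline numbers, the 15.9% figure is what the standard CAGR formula gives for growth from the 2019 base to the 2027 forecast. The short Python sketch below is an editorial addition; the eight-year compounding window is an assumption about how the report annualizes the 2019-2027 span.

# CAGR = (end / start) ** (1 / years) - 1
start_value = 4522.07     # USD million, 2019 (from the report summary)
end_value = 14697.41      # USD million, 2027 forecast
years = 8                 # 2019 to 2027 (assumed compounding window)

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"{cagr:.1%}")      # ~15.9%, matching the reported CAGR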

Cell expansion is the large-scale artificial production of daughter cells from a single cell, and the process is carried out to support medical research. It plays a critical role in exploring a wider range of benefits and applications of fully differentiated stem cell cultures for their use in therapeutics, drug screening, or advanced research.

R&D is a significant part of the business for a majority of pharmaceutical and biotech companies; they focus on R&D to come up with new molecules with the greatest medical and commercial potential for various therapeutic applications. The companies invest large amounts in these activities to deliver innovative, high-quality products to the market. Moreover, according to the Pharmaceutical Research and Manufacturers of America (PhRMA), the R&D expenditure of biopharmaceutical companies rose from US$ 49.6 billion in 2012 to US$ 58.8 billion in 2015.

Several government organizations are working on enhancing the detection methods and treatment procedures for cancer in the region. The National Cancer Institute (NCI) spends on various categories of treatment, including specific cancer sites, cancer types, and cancer-related diseases, as well as types of NCI research mechanisms. The NCI allocated ~US$ 208.4 million for cell expansion research in 2017 from its total budget of US$ 5,636.4 million for cancer research studies that year. Therefore, the growing R&D expenditure on cancer research is expected to provide market players with opportunities for business expansion.

The North American cell expansion market has been segmented on the basis of cell type into human cells and animal cells. The human cells segment held a larger share of the market in 2018, and it is also projected to register a higher CAGR during the forecast period. A rise in research activities for the treatment of cancer is expected to offer considerable growth opportunities to human cell expansion market players.

A few of the important secondary sources referred to for preparing this report on the cell expansion market are World Health Organization (WHO), Food and Drug Administration (FDA), Canadian Cancer Society, Centers for Disease Control and Prevention (CDC), and American Cancer Society.

Reasons to Buy:

Key Topics Covered:

1. Introduction
1.1 Scope of the Study
1.2 Report Guidance
1.3 Market Segmentation
1.3.1 North America Cell Expansion Market - By Product
1.3.2 North America Cell Expansion Market - By Cell Type
1.3.3 North America Cell Expansion Market - By Application
1.3.4 North America Cell Expansion Market - By End User
1.3.5 North America Cell Expansion Market - By Country

2. North America Cell Expansion Market- Key Takeaways

3. Research Methodology
3.1 Coverage
3.2 Secondary Research
3.3 Primary Research

4. North America Cell Expansion Market - Market Landscape
4.1 Overview
4.2 PEST Analysis
4.2.1 Cell Expansion Market - North America PEST Analysis
4.3 Expert Opinion

5. North America Cell Expansion Market - Key Market Dynamics
5.1 Key Market Drivers
5.1.1 Patient shift towards regenerative medicines
5.1.2 Increasing number of patients suffering with cancer
5.2 Key Restraints
5.2.1 Risk of contamination associated with the cell expansion process
5.3 Key Market Opportunities
5.3.1 Growing R&D Expenditure for Cancer Research
5.4 Future Trend
5.4.1 Extensive development in drug discovery
5.5 Impact Analysis

6. Cell Expansion Market - North America Analysis
6.1 North America Cell Expansion Market Revenue Forecasts and Analysis
6.2 Positioning Of Key Players

7. North America Cell Expansion Market Analysis And Forecasts To 2027 - Product
7.1 Overview
7.2 North America Cell Expansion Market, By Product 2018 & 2027 (%)
7.2.1 North America Cell Expansion Market Revenue and Forecasts to 2027, By Product (US$ Mn)
7.2.1.1 North America Consumables Market Revenue and Forecasts to 2027, By Type (US$ Mn)
7.2.1.1.1 North America Disposables Market Revenue and Forecasts to 2027, By Type (US$ Mn)
7.2.1.2 North America Instruments Market Revenue and Forecasts to 2027, By Type (US$ Mn)
7.3 Consumables
7.3.1 Overview
7.3.2 North America Consumables Market Revenue and Forecast to 2027 (US$ Mn)
7.3.3 Reagents, Media & Serum
7.3.3.1 Overview
7.3.3.2 North America Reagents, Media & Serum Market Revenue and Forecast to 2027 (US$ Mn)
7.3.4 Disposables
7.3.4.1 Overview
7.3.4.2 North America Disposables Market Revenue and Forecast to 2027 (US$ Mn)
7.3.4.3 Culture Tissue Flasks
7.3.4.3.1 Overview
7.3.4.3.2 North America Culture Tissue Flasks Market Revenue and Forecast to 2027 (US$ Mn)
7.3.4.4 Bioreactor Accessories
7.3.4.4.1 Overview
7.3.4.4.2 North America Bioreactor Accessories Market Revenue and Forecast to 2027 (US$ Mn)
7.3.4.5 Other Disposables
7.3.4.5.1 Overview
7.3.4.5.2 North America Other Disposables Market Revenue and Forecast to 2027 (US$ Mn)
7.4 Instruments
7.4.1 Overview
7.4.2 North America Instruments Market Revenue and Forecasts to 2027 (US$ Mn)
7.4.3 Cell Expansion Supporting Equipment
7.4.3.1 Overview
7.4.3.2 North America Cell Expansion Supporting Equipment Market Revenue and Forecast to 2027 (US$ Mn)
7.4.4 Bioreactors
7.4.4.1 Overview
7.4.4.2 North America Bioreactors Market Revenue and Forecast to 2027 (US$ Mn)
7.4.5 Automated Cell Expansion Systems
7.4.5.1 North America Automated Cell Expansion Systems Market Revenue and Forecast to 2027 (US$ Mn)

8. North America Cell Expansion Market Analysis And Forecasts To 2027 - Cell Type
8.1 Overview
8.2 North America Cell Expansion Market, By Cell Type 2018 & 2027 (%)
8.2.1 North America Cell Expansion Market Revenue and Forecasts to 2027, By Cell Type (US$ Mn)
8.3 Human Cells
8.3.1 Overview
8.3.2 North America Human Cells Market Revenue and Forecast to 2027 (US$ Mn)
8.3.3 Adult Stem Cells
8.3.3.1 Overview
8.3.3.2 North America Adult Stem Cells Market Revenue and Forecast to 2027 (US$ Mn)
8.3.4 Induced Pluripotent Stem Cells
8.3.4.1 Overview
8.3.4.2 North America Induced Pluripotent Stem Cells Market Revenue and Forecast to 2027 (US$ Mn)
8.3.5 Embryonic Stem Cells
8.3.5.1 Overview
8.3.5.2 North America Embryonic Stem Cells Market Revenue and Forecast to 2027 (US$ Mn)
8.3.6 Differentiated Cells
8.3.6.1 Overview
8.3.6.2 North America Differentiated Cells Market Revenue and Forecast to 2027 (US$ Mn)
8.4 Animal Cells
8.4.1 Overview
8.4.2 North America Animal Cells Market Revenue and Forecast to 2027 (US$ Mn)

9. North America Cell Expansion Market Analysis - By Application
9.1 Overview
9.2 North America Cell Expansion Market, By Application 2018 & 2027 (%)
9.3 Regenerative Medicine and Stem Cell Research
9.4 Cancer and Cell-based Research
9.5 Other Applications

10. North America Cell Expansion Market Analysis - By End User
10.1 Overview
10.2 North America Cell Expansion Market, By End User 2018 & 2027 (%)
10.3 Biopharmaceutical And Biotechnology Companies
10.4 Research Institutes
10.5 Cell Banks
10.6 Other End Users

11. Cell Expansion Market Revenue And Forecasts To 2027 - Geographical Analysis
11.1 North America Cell Expansion Market, Revenue and Forecast to 2027
11.1.1 North America Cell Expansion Market, Revenue and Forecast to 2027 (US$ Mn)
11.1.2 North America Cell Expansion Market, Revenue and Forecast to 2027, By Product (US$ Mn)
11.1.2.1 North America Consumables Market, Revenue and Forecast to 2027, By Type (US$ Mn)
11.1.2.1.1 North America Disposables Market, Revenue and Forecast to 2027, By Type (US$ Mn)
11.1.2.2 North America Instruments Market, Revenue and Forecast to 2027, By Type (US$ Mn)
11.1.3 North America Cell Expansion Market, Revenue and Forecast to 2027, By Cell Type (US$ Mn)
11.1.3.1 North America Human Cell Market, Revenue and Forecast to 2027, By Type (US$ Mn)
11.1.4 North America Cell Expansion Market, Revenue and Forecast to 2027, By Application (US$ Mn)
11.1.5 North America Cell Expansion Market, Revenue and Forecast to 2027, By End User (US$ Mn)
11.1.6 North America Cell Expansion Market, Revenue and Forecast to 2027, By Country (%)
11.1.7 US
11.1.8 Canada
11.1.9 Mexico

12. North America Cell Expansion Market - Industry Landscape
12.1 Overview
12.2 Growth Strategies In The Cell Expansion Market, 2017-2019
12.3 Organic Growth Strategies
12.3.1 Overview
12.3.1.1 Recent Organic Developments By Players In The Cell Expansion Market
12.4 Inorganic Growth Strategies
12.4.1 Overview
12.4.2 Recent Developments By Players In The Cell Expansion Market

13. Global Cell Expansion Market - Key Company Profiles
13.1 BD
13.1.1 Key Facts
13.1.2 Business Description
13.1.3 Financial Overview
13.1.4 Product Portfolio
13.1.5 SWOT Analysis
13.1.6 Key Developments
13.2 Merck KGaA
13.3 Thermo Fisher Scientific, Inc.
13.4 Terumo Corporation
13.5 General Electric Company
13.6 Corning Incorporated
13.7 Miltenyi Biotec
13.8 Danaher
13.9 Lonza
13.10 STEMCELL Technologies, Inc.

14. Appendix
14.1 About the Publisher
14.2 Glossary of Terms

For more information about this report visit https://www.researchandmarkets.com/r/gq37sj

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

The rest is here:
Insights on the Cell Expansion Industry in North America to 2027 - by Product, Cell Type, Application, End-user and Country - GlobeNewswire

Read More...

The global artificial intelligence in healthcare market is set to register growth, projecting a CAGR of 38.05% during the forecast period, 2020-2028 -…

Friday, April 24th, 2020

NEW YORK, April 22, 2020 /PRNewswire/ --

KEY FINDINGS
The global artificial intelligence in healthcare market is set to register growth, projecting a CAGR of 38.05% during the forecast period, 2020-2028. The prominent drivers of market growth are estimated to be the rising volume of big data in the healthcare industry, the growing use of AI in genetics, the emergence of personalized medicine in tests for clinical decision making, and the creation of real-time monitoring systems enabled by AI.

Read the full report: https://www.reportlinker.com/p05242360/?utm_source=PRN

MARKET INSIGHTS
The use of AI in healthcare entails software and algorithms that approximate human perception in analyzing complex medical data, along with the relationship between treatments or prevention techniques and patient outcomes. The growing demand for real-time monitoring systems is one of the key aspects propelling the growth of the global artificial intelligence in healthcare market.

Real-time monitoring devices, such as health monitors and indicators, track patients' health data as it is generated, which is increasing the demand for AI in healthcare. These devices also improve the relevance of data interpretation and reduce the time patients spend piecing together data output.

In healthcare, these devices help detect and prevent undesirable patient outcomes. The growing number of mobile devices integrated with artificial intelligence assists in predicting future health outcomes, which further benefits market growth. On the other hand, medical practitioners are reluctant to adopt AI-based technologies, and this is restraining the growth of the market. The reluctance stems from a lack of data to support healthcare decisions.

Also, from a diagnostics point of view, AI systems are currently less efficient than conventional methods. The companies in the market compete against each other by offering similar features at similar prices.

The competitive rivalry is projected to be high during the forecast period.

REGIONAL INSIGHTS
The geographical segmentation of the global artificial intelligence in healthcare market includes the analysis of Europe, North America, Asia Pacific, and the rest of the world. Inkwood Research estimates the Asia Pacific region to be the fastest-growing region by the end of the forecast period.

The invention of new technologies, the presence of countries like China, Japan, Australia, and India, and the thriving artificial intelligence market, are the factors propelling the growth of the market.

COMPETITIVE INSIGHTS
Some of the prominent companies operating in the market are Enlitic Inc, Next IT Corporation, Recursion, Welltok, GE Healthcare, Microsoft Corporation, etc.

Our report offerings include:
- Key findings of the overall market
- Strategic breakdown of market dynamics (drivers, restraints, opportunities, challenges)
- Market forecasts for a minimum of 9 years, along with 3 years of historical data, for all segments, sub-segments, and regions
- Market segmentation, with a thorough assessment of key segments and their market estimations
- Geographical analysis: assessments of the mentioned regions and country-level segments with their market share
- Key analytics: Porter's Five Forces analysis, vendor landscape, opportunity matrix, key buying criteria, etc.
- Competitive landscape: an explanation of the key companies based on factors such as market share
- Company profiling: a detailed company overview, products/services offered, SCOT analysis, and recent strategic developments

Companies mentioned:
1. DEEP GENOMICS INC
2. ENLITIC INC
3. GE HEALTHCARE
4. GENERAL VISION INC
5. GOOGLE
6. IBM CORPORATION
7. ICARBONX
8. INTEL CORPORATION
9. MICROSOFT CORPORATION
10. NEXT IT CORPORATION
11. NVIDIA CORPORATION
12. ONCORA MEDICAL
13. RECURSION PHARMACEUTICALS INC
14. STRYKER CORPORATION
15. WELLTOK INC

Read the full report: https://www.reportlinker.com/p05242360/?utm_source=PRN

About Reportlinker
ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.

Contact Clare: clare@reportlinker.com
US: (339)-368-6001
Intl: +1 339-368-6001

View original content:http://www.prnewswire.com/news-releases/the-global-artificial-intelligence-in-healthcare-market-is-set-to-register-growth-projecting-a-cagr-of-38-05-during-the-forecast-period-2020-2028--301045674.html

SOURCE Reportlinker

More:
The global artificial intelligence in healthcare market is set to register growth, projecting a CAGR of 38.05% during the forecast period, 2020-2028 -...

Read More...

7 Beautiful Biotech Stocks to Buy Here – Investorplace.com

Friday, April 24th, 2020

It's pretty hard to ignore the impact of Covid-19 on the global and domestic economy. But the shutdown in the U.S. economy, while swift, was also swiftly managed by the administration and Congress, as well as the Federal Reserve. This has helped buoy the markets after a precipitous initial drop. And it has allowed enough time for investors to process everything that has happened and reset their expectations, looking away from certain risks and toward cleaning-supply companies and biotech stocks.

Before Covid-19, there was a feeling that the economy was nearing recession sometime this year. Now we're in one.

However, this remains an uneven market. Big stocks like Amazon (NASDAQ:AMZN) and Netflix (NASDAQ:NFLX) are doing even better under these difficult conditions. But other consumer stocks aren't.

That's why I wanted to talk about an opportunity that doesn't have to do with consumer-driven stocks: the seven beautiful biotech stocks to buy here. These companies are set up to endure long drug approval processes that play out through good times and bad.

They're built to be immune from general market forces, especially the big one these days, consumer spending, and to benefit from powerful technology that's popping up in all kinds of industries.

Source: Catalin Rusnac/ShutterStock.com

CRISPR Therapeutics is a Swiss biotech that is one of the leaders in CRISPR technology. This is a new and fast-growing field. CRISPR (it stands for clustered regularly interspaced short palindromic repeats) is a DNA segment containing short repetitions of base sequences.

By using these, scientists are learning to identify where there is a break in the pattern that may signal a disease and then repair (or edit) that sequence. This is a revolutionary concept in treating disease, since it is the first step toward personalized medicine.

Currently, researchers hope to build treatments to help battle many different cancers and other diseases. But eventually the technology could be applied to a much broader range of needs.

The company is well funded and will continue to make a difference, even sequencing Covid-19.

The stock is up 37% in the past year and over 30% in the past month.

Source: Pavel Kapysh / Shutterstock.com

Acceleron is a biopharmaceutical company that has been around since 2003. Last year was a wild ride for the firm, and the volatility continued into 2020.

In September, one of its drugs in trials was rejected by the FDA and the stock tanked. Then, shortly after that, its drug luspatercept, which treats a rare blood disorder, was approved.

And then, in late January, data from a Phase 2 trial of another drug it has in the approval process, for pulmonary arterial hypertension (PAH), showed significant positive results in treating the disorder. This is the kind of situation that can attract big buying pressure on Wall Street, the kind I want to see in Growth Investor.

Needless to say, the stock is now up 134% in the past year, and 38% in the past month.

One of its largest shareholders is Bristol Myers Squibb (NYSE:BMY). Acceleron had been doing a lot of work with Celgene when BMY bought Celgene. This is a great partnership to have when looking to market, manufacture and distribute these new drugs.

Source: Shutterstock

Neurocrine Biosciences has been in business for over 25 years and specializes in developing treatments for people suffering from neurological, endocrine and psychological disorders.

It currently has treatments for tardive dyskinesia and endometriosis. It has candidates for Parkinson's, uterine fibroids and congenital adrenal hyperplasia. Last year, there was talk that Biogen (NASDAQ:BIIB) may have been looking at NBIX as an acquisition to build on its own work in some of these sectors.

It's a solid company with a good balance of revenue-generating drugs and a solid, focused drug pipeline. Neurocrine has a unique niche and will remain an attractive takeover target for larger pharmaceutical companies and biotechs.

The stock is up 17% for the past year, and 20% in the past month.

Source: Jarretera / Shutterstock.com

Galapagos is a Belgian company that specializes in small molecule and antibody therapies.

It was a solid performer and has been around for over 20 years. But its big break happened last year when Gilead approached the company and offered it a $5 billion partnership deal. This brought the stock to the attention of many U.S. investors who hadn't paid much attention to it.

And by December, that partnership paid off. GILD filed with the FDA for speedy review of filgotinib, a potential blockbuster drug that GLPG developed and had in Europe and Japan but not the U.S. The U.S. is the market where the money is made, since pricing is far more dynamic than in other countries.

The approval came in December. And this could mean big things for both companies. But since then, Covid-19 has taken the spotlight and Gilead's remdesivir has been all the buzz.

GLPG stock is up over 80% in the past 12 months, and almost 45% in the past month. Its management has also been wise enough to partner with a genomics company that provides what's sometimes nicknamed the mother of all technologies to help discover new treatments and drug combinations.

Source: Eyesonmilan / Shutterstock.com

Incyte has been around since the early 1990s and is a good-sized biopharmaceutical company with a $21 billion market capitalization.

Currently it has two drugs in the marketplace; one of those is sold in the U.S. Its big drug is Jakafi (ruxolitinib), which treats a rare form of blood cancer and is also approved to manage acute graft-versus-host disease in adults.

It also has a drug that it acquired from ARIAD Pharmaceuticals for European distribution. Iclusig (ponatinib) is used to treat leukemia, and Incyte hopes to make the drug available in the U.S. after approval from the FDA.

Incyte also has a good number of drugs in the pipeline and has the financial wherewithal to keep moving them forward, even now.

The stock is up 36% in the past 12 months, and over 60% in the past month.

Source: madamF / Shutterstock.com

Regeneron is one of those biotech stocks that has been a direct Covid-19 beneficiary.

It has a number of drugs in the market and around 30 drugs in its pipeline. It has been around since 1988 and has a $62 billion market cap, so this is no one-trick pony rolling the dice on a potential blockbuster. It has built its reputation over time, delivering solid drugs in important sectors.

But the added juice at this point is its arthritis drug Kevzara, which it co-markets with Sanofi (NASDAQ:SNY). It has been given to patients in China and New York (the co-founder of Regeneron is from Queens), but the testing isn't broad enough to deliver any conclusive information.

Obviously, the bet on Kevzara being a treatment is just that, a bet. But it has brought more attention to REGN, which is a quality pharma. And Regeneron also has its eye on the big picture of next-generation technology to supercharge its genomics research at the Regeneron Genetics Center.

The stock is up almost 62% in the past year and up 22% in the past month.

Source: Michael Moloney / Shutterstock.com

Gilead has been a big name among biotech stocks for a long time. It was a key player in finding a treatment for HIV/AIDS. And it was also a pioneer in finding a highly effective once-a-day regimen for hepatitis C.

While it still makes a good amount of money from these blockbusters, it hadn't had a big hit in a while, and the stock flat-lined as investors wondered if its best days were behind it.

But it has made some interesting acquisitions and partnerships in the past couple years, one of those being with Galapagos.

And now, remdesivir is in two Phase 3 clinical trials as a treatment for Covid-19. This, along with the promising partnership with GLPG stock that's already delivering results, means that GILD could be back in the running with two potential blockbusters. Even one would be great news.

The stock is on the move, up 30% in the past 12 months, and 13% in the past month. And it is still off its 2015 highs, so there's plenty of headroom if either or both of these drugs live up to their promise.

Gilead is one of the big kahunas in this space, and now a household name, thanks to the fight against Covid-19. It, too, is partnering with smaller labs to harness the power of the game-changing technology of our time: artificial intelligence (AI).

If artificial intelligence sounds futuristic, even far-fetched, well, keep in mind that you're already using it every day. If you've ever used Alphabet's (NASDAQ:GOOG, NASDAQ:GOOGL) Google Assistant or Apple's (NASDAQ:AAPL) Siri, if you've had Netflix (NASDAQ:NFLX) recommend a movie or Zillow (NASDAQ:Z) recommend a house, even if you've used an email spam filter, then you've used artificial intelligence.

In this new world of AI everywhere, data becomes a hot commodity.

As scientists find even more applications for artificial intelligence, from hospitals to retail to self-driving cars, it's incredible to imagine how much data will be involved.

To create AI programs in the first place, tech companies must collect vast amounts of data on human decisions. Data is what powers every AI system. As one AI researcher from the University of South Florida puts it, data is the new oil.

To cash in, you'll want the company that makes the brain that all AI software needs to function, spot patterns and interpret data.

It's known as the Volta Chip, and it's what makes the AI revolution possible.

You don't need to be an AI expert to take part. I'll tell you everything you need to know, as well as my buy recommendation, in my special report for Growth Investor, The A.I. Master Key. The stock is still under my buy limit price, so you'll want to sign up now. That way, you can get in while you can still do so cheaply.

Click here for a free briefing on this groundbreaking innovation.

Louis Navellier had an unconventional start, as a grad student who accidentally built a market-beating stock system with returns rivaling even Warren Buffett's. In his latest feat, Louis discovered the Master Key to profiting from the biggest tech revolution of this (or any) generation. Louis Navellier may hold some of the aforementioned securities in one or more of his newsletters.

Read the rest here:
7 Beautiful Biotech Stocks to Buy Here - Investorplace.com

Read More...

genetics – Kids | Britannica Kids | Homework Help

Friday, April 24th, 2020

In the 1850s and 1860s an Austrian monk named Gregor Mendel studied pea plants in his garden. He found that there were rules for how traits passed from one generation of pea plants to the next. The rules are the same for every plant and animal. During his lifetime no one understood how important these findings were.

In 1900 people rediscovered Mendels work. From then on, the new science of genetics grew rapidly. Scientists began to use it to help explain the theory, or idea, of evolution. An English scientist named Charles Darwin had put forth the theory in the 1850s. It describes how species adapt to their environment and how new species form.

In 1953 James Watson of the United States and Francis Crick of England discovered the structure of DNA. Their studies helped scientists understand how genes work and how they make copies of themselves.

By the mid-1970s, scientists had learned how to locate, remove, and insert specific genes in DNA. This work is called genetic engineering. By the 1990s scientists could clone animals, or produce animals that have exactly the same DNA as another animal. In 1996 researchers in Scotland produced the first clone of an adult mammala sheep. Some scientists worked toward cloning human beings. But others saw this work as dangerous and wrong.

In 2003 a team of researchers finished a project to identify and locate all the genes in all human DNA. The results will help scientists in the study of human biology and medicine.

Original post:
genetics - Kids | Britannica Kids | Homework Help

Read More...

Genetics: The Study of Heredity | Live Science

Friday, April 24th, 2020

Genetics is the study of how heritable traits are transmitted from parents to offspring. Humans have long observed that traits tend to be similar in families. It wasn't until the mid-nineteenth century that the larger implications of genetic inheritance began to be studied scientifically.

Natural selection

In 1858, Charles Darwin and Alfred Russel Wallace jointly announced their theory of natural selection. According to Darwin's observations, in nearly all populations individuals tend to produce far more offspring than are needed to replace the parents. If every individual born were to live and reproduce still more offspring, the population would outstrip its resources and collapse. Overpopulation leads to competition for resources.

Darwin observed that it is very rare for any two individuals to be exactly alike. He reasoned that these natural variations among individuals lead to natural selection. Individuals born with variations that confer an advantage in obtaining resources or mates have greater chances of reproducing offspring who would inherit the favorable variations. Individuals with different variations might be less likely to reproduce.

Darwin was convinced that natural selection explained how natural variations could lead to new traits in a population, or even new species. While he had observed the variations existent in every population, he was unable to explain how those variations came about. Darwin was unaware of the work being done by a quiet monk named Gregor Mendel.

Inheritance of traits

In 1866, Gregor Mendel published the results of years of experimentation in breeding pea plants. He showed that both parents must pass discrete physical factors, which transmit information about their traits, to their offspring at conception. An individual inherits one such unit for a trait from each parent. Mendel's principle of dominance explained that most traits are not a blend of the father's traits and the mother's, as was commonly thought. Instead, when an offspring inherits factors for opposing forms of the same trait, the dominant form of that trait will be apparent in that individual. The factor for the recessive trait, while not apparent, is still part of the individual's genetic makeup and may be passed to offspring.

Mendel's experiments demonstrated that when sex cells are formed, the factors for each trait that an individual inherits from its parents are separated into different sex cells. When the sex cells unite at conception, the resulting offspring will have at least two factors (alleles) for each trait: one inherited from the mother and one from the father. Mendel used the laws of probability to demonstrate that when the sex cells are formed, it is a matter of chance as to which factor for a given trait is incorporated into a particular sperm or egg.
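
That probabilistic view of segregation is easy to make concrete. The following short Python sketch is an editorial illustration, not part of the article: it simulates a cross between two hypothetical heterozygous parents for a trait in which allele 'A' is dominant over 'a'.

import random

def gamete(parent):
    # Each parent passes one of its two factors (alleles) into a sex cell at random.
    return random.choice(parent)

def offspring(mother, father):
    # Fertilization unites one allele from each parent.
    return gamete(mother) + gamete(father)

random.seed(1)
parents = ("Aa", "Aa")   # both parents heterozygous; 'A' is dominant
n = 10_000
dominant = sum(1 for _ in range(n) if "A" in offspring(*parents))
print(dominant / n)      # close to 0.75: the classic 3:1 dominant-to-recessive ratio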

We now know that simple dominance does not explain all traits. In cases of co-dominance, both forms of the trait are equally expressed. Incomplete dominance results in a blending of traits. In cases of multiple alleles, there are more than just two possible ways a given gene can be expressed. We also now know that most expressed traits, such as the many variations in human skin color, are influenced by many genes all acting on the same apparent trait. In addition, each gene that acts on the trait may have multiple alleles. Environmental factors can also interact with genetic information to supply even more variation. Thus sexual reproduction is the biggest contributor to genetic variation among individuals of a species.

Twentieth-century scientists came to understand that combining the ideas of genetics and natural selection could lead to enormous strides in understanding the variety of organisms that inhabit our earth.

Mutation

Scientists realized that the molecular makeup of genes must include a way for genetic information to be copied efficiently. Each cell of a living organism requires instructions on how and when to build the proteins that are the basic building blocks of body structures and the workhorses responsible for every chemical reaction necessary for life. In 1953, when James Watson and Francis Crick described the structure of the DNA molecule, this chemical structure explained how cells use the information from the DNA stored in the cell's nucleus to build proteins. Each time cells divide to form new cells, this vast chemical library must be copied so that the daughter cells have the information required to function. Inevitably, each time the DNA is copied, there are minute changes. Most such changes are caught and repaired immediately. However, if an alteration is not repaired, the change may result in an altered protein. Altered proteins may not function normally. Genetic disorders are conditions that result when malfunctioning proteins adversely affect the organism. [Gallery: Images of DNA Structures]

In very rare cases the altered protein may function better than the original or result in a trait that confers a survival advantage. Such beneficial mutations are one source of genetic variation.

Gene flow

Another source of genetic variation is gene flow, the introduction of new alleles to a population. Commonly, this is due to simple migration. New individuals of the same species enter a population. Environmental conditions in their previous home may have favored different forms of traits, for example, lighter colored fur. Alleles for these traits would be different from the alleles present in the host population. When the newcomers interbreed with the host population, they introduce new forms of the genes responsible for traits. Favorable alleles may spread through the population. [Countdown: Genetics by the Numbers 10 Tantalizing Tales]

Genetic drift

Genetic drift is a change in allele frequency that is random rather than being driven by selection pressures. Remember from Mendel that alleles are sorted randomly into sex cells. It could just happen that both parents contribute the same allele for a given trait to all of their offspring. When the offspring reproduce they can only transmit the one form of the trait that they inherited from their parents. Genetic drift can cause large changes in a population in only a few generations especially if the population is very small. Genetic drift tends to reduce genetic variation in a population. In a population without genetic diversity there is a greater chance that environmental change may decimate the population or drive it to extinction.
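
The effect of population size on drift is easy to see in a small simulation. The sketch below is an editorial illustration, not from the article: it is a Wright-Fisher-style model in which each generation's allele copies are drawn at random from the previous generation, with no selection acting; the population sizes and generation counts are arbitrary choices.

import random

def drift(pop_size, start_freq=0.5, generations=100):
    # Track one allele's frequency under pure random sampling (no selection).
    freq = start_freq
    for _ in range(generations):
        # Each of the 2N allele copies in the next generation is drawn at random
        # from the current generation's allele pool.
        copies = sum(1 for _ in range(2 * pop_size) if random.random() < freq)
        freq = copies / (2 * pop_size)
        if freq in (0.0, 1.0):   # allele lost or fixed: variation is gone
            break
    return freq

random.seed(2)
print([drift(pop_size=10) for _ in range(5)])      # small population: alleles often fix or vanish
print([drift(pop_size=10_000) for _ in range(5)])  # large population: frequencies barely move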

Mary Bagley, LiveScience Contributor

Further reading:

Read more:
Genetics: The Study of Heredity | Live Science

Read More...

Darwin and Genetics | Genetics

Friday, April 24th, 2020

Abstract

Darwin's theory of natural selection lacked an adequate account of inheritance, making it logically incomplete. We review the interaction between evolution and genetics, showing how, unlike Mendel, Darwin's lack of a model of the mechanism of inheritance left him unable to interpret his own data that showed Mendelian ratios, even though he shared with Mendel a more mathematical and probabilistic outlook than most biologists of his time. Darwin's own pangenesis model provided a mechanism for generating ample variability on which selection could act. It involved, however, the inheritance of characters acquired during an organism's life, which Darwin himself knew could not explain some evolutionary situations. Once the particulate basis of genetics was understood, it was seen to allow variation to be passed intact to new generations, and evolution could then be understood as a process of changes in the frequencies of stable variants. Evolutionary genetics subsequently developed as a central part of biology. Darwinian principles now play a greater role in biology than ever before, which we illustrate with some examples of studies of natural selection that use DNA sequence data and with some recent advances in answering questions first asked by Darwin.

The power of Selection, whether exercised by man or brought into play under nature through the struggle for existence and the consequent survival of the fittest, absolutely depends on the variability of organic beings. Without variability, nothing can be effected; slight individual differences, however, suffice for the work, and are probably the chief or sole means in the production of new species. Charles Darwin (1868)

CHARLES Darwin was the first person to appreciate clearly that evolution depends on the existence of heritable variability within a species to generate the differences between ancestral and descendant populations. The development of Darwin's thoughts on the nature and causes of evolution is clearly documented in his transmutation notebooks of 1836-1838 (Barrett et al. 1987). Once he had decided that species originated by descent with modification, Darwin quickly realized the need to find a mechanism for accomplishing the changes involved. In formulating the idea of natural selection, he was greatly influenced by the experience of breeders in artificially selecting populations of domestic animals and plants. Chapter 1 of The Origin of Species (Darwin 1859) is famously devoted to documenting the existence of variability in these populations and the effectiveness of artificial selection:

The key is man's power of cumulative selection: nature gives successive variations; man adds them up in certain directions useful to himself (Darwin 1859, p. 30).

It was only a short step to applying this observation to selection in nature:

Can it, then, be thought improbable, seeing that variations useful to man have undoubtedly occurred, that other variations useful in some way to each being in the great and complex battle of life, should sometimes occur in the course of thousands of generations? This preservation of favourable variations and the rejection of injurious variations, I call Natural Selection (Darwin 1859, pp. 8182).

Most of the books and papers that Darwin published after The Origin of Species were devoted to describing how a vast range of biological phenomenafrom the sexual systems of plants to human anatomy and behaviorcould be interpreted in terms of evolution by natural selection or by the special form of natural selection represented by sexual selection. Surprisingly (at least from today's perspective), many biologists were, for a long time, far from convinced that natural selection was the predominant guiding force in evolution. This continued into the 1920s. In the Introduction to Volume 1 of his treatise on evolutionary genetics, Sewall Wright noted:

Along with the universal acceptance by biologists of evolution as a fact, there came to be increasing dissatisfaction, during the latter part of the nineteenth century, with natural selection as the master theory of causation (Wright 1968, pp. 7-8).

Prominent early geneticists such as William Bateson, Hugo de Vries, and Richard Goldschmidt were notorious skeptics about natural selection and the evolutionary role of the small individual differences relied on by Darwin, emphasizing instead the role of mutations with large and manifold effects (Provine 1971). Many naturalists and paleontologists held what now seem to us to be semi-mystical theories, such as internal drives to improvement or perfection; many of them espoused Lamarckian views up until the 1930s (in France and in the Soviet Union and its satellites, Lamarckism persisted well into the 1960s). In his classic history of modern science, The Edge of Objectivity, Charles Coulston Gillispie quotes the leading historian of biology in 1929, Erik Nordenskiold, as stating that the proposition that natural selection does not operate in the form imagined by Darwin must certainly be taken as proved (Gillispie 1960, p. 320). The book Evolution in the Light of Modern Knowledge, a compendium of essays by 13 leading British biologists, published by Blackie and Son in 1925 to provide (according to the publisher's note) an authoritative statement about the doctrine of evolution ... after the general upheaval of fundamental theories in the past 20 years, has no index reference to natural selection. This contrasts with 3253 articles mentioning natural selection and evolution in 2008 in the Web of Science database. For a detailed discussion of anti-Darwinian evolutionary ideas, see Bowler (1983) and Gayon (1998).

Why was there such skepticism toward natural selection, and why have things changed so much? One reason was the lack during Darwin's lifetime of direct evidence for natural selection. This started to change in the late 19th and early 20th centuries through the work of Bumpus (1899) in the United States, and Weldon (1895, 1901) and his student Di Cesnola (1907) in Europe. These scientists initiated the field now known as ecological genetics, and we now have literally thousands of examples where field naturalists have demonstrated the operation of natural selection in the wild on both discrete polymorphisms and quantitative traits (Kingsolver et al. 2001; Bell 2008; Leimu and Fischer 2008).

The other major factor, of course, was the fact that Darwin failed to arrive at an understanding of the mechanism of inheritance, despite realizing its importance and devoting a vast effort to assembling evidence in his Variation of Animals and Plants Under Domestication (Darwin 1868). Unfortunately, he was unaware of Mendel's work, despite its publication 2 years earlier (Mendel 1866). Mendel's work has now, of course, permanently revolutionized our understanding of heredity, and his tragic failure to obtain recognition in his lifetime is a well-known story. It is less well known that Mendel was well aware of the importance for evolution of understanding genetics:

This seems to be the one correct way of finally reaching a solution to a question whose significance for the evolutionary history of organic forms cannot be underestimated (Mendel 1866, p. 2).

Sadly, even if Mendel had lived to see the rediscovery of his work, he probably would not have had the satisfaction of seeing it contribute to evolutionary understanding because, even after genetics had begun its rapid development in the early decades of the 20th century, evolutionary biologists initially failed to understand how to incorporate genetics into their work. We will outline these failures to achieve a synthesis later, but first consider Darwin's efforts to understand inheritance and how his approach fell short of Mendel's.

Mendel's ability to solve the most difficult problem in 19th century biology after the mechanism of evolution rests on his use of a then-unique approach: combining rigorous genetic experiments with quantitative, probabilistic predictions about their expected outcomes: in other words, using biological data to test a quantitative hypothesis. It is a triumph of productive theoretical reasoning that Mendel proposed his particulate inheritance hypothesis well before a proper understanding of the cellular basis of sexual reproduction was achieved by either animal or plant biologists (Farley 1982).

This achievement eluded Darwin, the other greatest mind in 19th century biology, although he came close to seeing the same phenomena as Mendel did and frequently looked at data in a quantitative manner (Howard 2009). Darwin repeatedly referred to the phenomenon of reversion to ancestral types in Variation of Animals and Plants Under Domestication (Darwin 1868). He also compiled examples of the transmission of traits down several generations of pedigrees and obtained help from the mathematical physicist Sir George Stokes to show that these cases are unlikely to be due to chance, one of the first examples of a test of statistical significance in biology (Darwin 1868, chap. 12).

Ironically, Darwin analyzed data from his own crossing experiments on distyly in Primula species (summarized in Darwin 1877, chap. 5), which gave what we can now see as clear evidence for Mendelian ratios (see also Bulmer 2003, p.112, and Howard 2009). In distylous species (Figure 1), the long-styled morphs (L) are now known to be homozygotes ss for the alleles at several loci in a supergene controlling style length, stamen position, pollen and stigma placement, morphology, and incompatibility, whereas the short-styled morph (S) is heterozygous Ss. The only matings that invariably succeed are L S and S L (Darwin called these legitimate pollinations), and these give a 1:1 ratio of L and S plants (Table 1). It is occasionally possible to obtain seeds by self-fertilization, in which case L plants produce only L offspring (Table 1). Darwin stated:

Figure 1. Distyly in primroses. (A) Long-styled (pin) and short-styled (thrum) flowers of Primula vulgaris. (B) Vertically sectioned flowers, with the compatible pollinations indicated. (Pollen from high anthers is compatible with stigmas of long-styled plants, and pollen from low anthers is compatible with stigmas of short-styled plants, while the other two types of pollinations are incompatible.)

Table 1. Darwin's results for progeny of long- and short-styled Primula crossed with the same morph

From the long-styled form, always fertilised with its own-form pollen, I raised in the first generation three long-styled plants, from their seed 53 long-styled grandchildren, from their seed 4 long-styled great-grandchildren, from their seed 20 long-styled great-great-grandchildren, and lastly, from their seed 8 long-styled and 2 short-styled great-great-great-grandchildren. altogether 162 plants were raised, 156 long-styled and 6 short-styled (Darwin 1877, pp. 228229).

The few short-styled plants in the final generation were presumably contaminants (Darwin's experiments were remarkably free from them). Self-pollination of S plants should generate a 3:1 ratio, as Darwin found (see Table 1; none of the ratios deviates significantly from the expected ratio). He remarked:

I raised at first from a short-styled [P. sinensis] plant fertilised with its own-form pollen one long-styled and seven short-styled illegitimate seedlings ... Dr. Hildebrand raised fourteen plants, of which eleven were short-styled and three long-styled (Darwin 1877, p. 216).
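
Those counts can be checked against the 3:1 expectation with a standard goodness-of-fit test. The following Python sketch is an editorial addition (it uses scipy, which is of course not part of Darwin's or the authors' analysis); it confirms that neither sample deviates significantly from 3:1.

from scipy.stats import chisquare

def fit_to_3_to_1(n_short, n_long):
    total = n_short + n_long
    expected = [0.75 * total, 0.25 * total]   # 3:1 short- to long-styled
    return chisquare([n_short, n_long], f_exp=expected)

print(fit_to_3_to_1(7, 1))    # Darwin's selfed P. sinensis: p ~ 0.41
print(fit_to_3_to_1(11, 3))   # Hildebrand's plants: p ~ 0.76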

Darwin failed to understand the significance of these results because he had no model of particulate inheritance that could be applied to genetic data. Indeed, Darwin appears to have maintained a belief in the predominance of blending inheritance, as did nearly all of his contemporaries. As Fisher pointed out in chapter 1 of The Genetical Theory of Natural Selection (Fisher 1930), there are few explicit statements on this in Darwin's published works, although they appear in some of his unpublished notes and essays. In addition, chapter 15 of Variation of Animals and Plants Under Domestication (Darwin 1868) starts with a section On Crossing as a Cause of Uniformity of Character, which implicitly assumes that crossing leads to blending. It is unclear, however, to what extent he thought that an offspring was a product of the complete fusion of the genetic contributions of its parents (Bulmer 2003, chap. 4).

Blending inheritance leads to a difficulty that was forcefully pointed out by Fleeming Jenkin (Jenkin 1867), the professor of engineering at the University of Edinburgh (the building next to ours is, somewhat unfortunately perhaps, named after him). Under blending inheritance, variation decays rapidly because the genotypes of the offspring of a cross are all the same and are intermediate between those of the two parents. With random mating, the genetic variance of a quantitative trait then decays by a factor of one-half each generation (Fisher 1930, p. 4). Acceptance of blending inheritance clearly raises doubts about the ability of either natural or artificial selection to make permanent changes in a population. In the sixth edition of The Origin of Species, published in 1872, Darwin reacted to Jenkin as follows:
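
Jenkin's point is easy to reproduce numerically. The sketch below is an editorial illustration, not part of the article: each offspring's trait value is simply the average of two randomly chosen parents, which is pure blending under random mating, and the variance is roughly halved every generation.

import random
import statistics

random.seed(3)
population = [random.gauss(0, 1) for _ in range(10_000)]   # initial trait values, variance ~1

for generation in range(1, 6):
    # Under blending inheritance, an offspring's value is the mean of its two parents.
    population = [(random.choice(population) + random.choice(population)) / 2
                  for _ in range(len(population))]
    print(generation, round(statistics.variance(population), 3))
# Output falls roughly as 0.5, 0.25, 0.125, ...: without a source of new variation,
# blending erases the material on which selection could act.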

Nevertheless, until reading an able and valuable article in the North British Review (1867), I did not appreciate how rarely single variations, whether slight or strongly-marked, could be perpetuated (Darwin 1859, pp. 111-112).

Since heritable variability is required for selection to be effective, and Darwin's survey of the results of artificial selection had convinced him that there is enough variation for it to be effective, Darwin sought a way of generating an abundance of such variation. This was provided by his theory of pangenesis, according to which variations experienced by the individual during its lifetime are transmitted to the germ cells by hypothetical gemmules (Darwin 1868, chap. 27). This is an hypothesis of the inheritance of acquired characters, which Darwin accepted as an experimentally established fact (there is an extensive discussion on the transmission of mutilations in Darwin 1868, chap. 12).

However, Darwin was clearly not quite sure about this. For example, he mentioned that the circumcision of male infants has not led to a loss of the foreskin in the Jewish community (Darwin 1868, Vol. 1, p. 558). He also noted that there are some instances of evolution that cannot be explained by this hypothesis, notably the adaptive characteristics of the sterile castes of social insects:

For no amount of exercise, or habit, or volition, in the utterly sterile members of a community could possibly have affected the structure or instincts of the fertile members, which alone leave descendants. I am surprised that no one has advanced this demonstrative case of neuter insects, against the well-known doctrine of Lamarck (Darwin 1859, p. 242).

Darwin's use of this natural case of sib selection to refute Lamarckism anticipates later uses of the same reasoning, which reached a peak of perfection in the Lederbergs' experiments on replica plating in Escherichia coli (Lederberg and Lederberg 1952).

Unlike Darwin, who regarded the inheritance of acquired characters largely as a source of variation on which selection could act, the 20th century advocates of Lamarckian inheritance viewed it as an alternative explanation of adaptive evolution. As was brilliantly laid out by Fisher in chapter 1 of The Genetical Theory of Natural Selection, and as is no doubt familiar to readers of Genetics, all the difficulties posed by blending disappear with Mendelian inheritance: variability within a population is conserved, not lost, when no evolutionary forces are acting, a genetic equivalent to Galileo's law of inertia. The inheritance of acquired characters is therefore not needed for the regeneration of genetic variability.
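
That genetic law of inertia can also be illustrated directly. The sketch below is an editorial addition (the Hardy-Weinberg framing is standard textbook material, not a claim about the authors' analysis): under random mating with particulate inheritance and no selection, mutation, or migration, genotype proportions settle at p^2, 2pq, q^2 and the allele frequency stays put.

import random

random.seed(4)
p = 0.3             # frequency of allele A; q = 1 - p
pop_size = 200_000  # large, so sampling noise is negligible

for generation in range(3):
    # Each offspring receives two alleles drawn at random from the gamete pool,
    # transmitted intact rather than blended.
    counts_of_A = [(random.random() < p) + (random.random() < p) for _ in range(pop_size)]
    aa = counts_of_A.count(0) / pop_size
    Aa = counts_of_A.count(1) / pop_size
    AA = counts_of_A.count(2) / pop_size
    p = AA + Aa / 2   # allele frequency in the offspring generation
    print(generation, round(AA, 3), round(Aa, 3), round(aa, 3))
# Genotype proportions stay near 0.09, 0.42, 0.49 and p stays near 0.3:
# variability is conserved when no evolutionary forces act.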

It is, of course, well known that our knowledge of the physical basis of genes and of their behavior now largely excludes Lamarckian inheritance. However, recent studies have uncovered some situations in which the DNA of certain genome regions is modified during the life of an individual, and these epigenetic marks with functions in developmental control and other processes can sometimes pass via meiotic divisions to descendant generations (e.g., Cubas et al. 1999; Richards 2006; Namekawa et al. 2007; Heijmans et al. 2008; Sidorenko and Chandler 2008). If variants that arise in this way are stably transmitted, then they can be treated as Mendelian variants that can be exploited in evolution. If their inheritance is unstable, as is often the case, they cannot contribute significantly to evolution.

The breakthrough in understanding the nature of variation in quantitative traits (equivalent to Darwin's slight differences, which may be called individual differences; see the epigraph to this article) came in the early years of genetics, starting with experiments with pure lines, whose individuals have virtually identical genotypes. These experiments showed that plentiful phenotypic variation exists among such individuals but is not transmissible to the offspring (Johannsen 1909; Wright 1920), leading to the rejection of Lamarckian inheritance by the genetics community. Furthermore, the variability of quantitative traits (which often show apparent blending in F1 crosses between pure lines) increases in F2 and later generations (Nilsson-Ehle 1909; East 1910), as expected with particulate Mendelian inheritance. Moreover, the factors responsible can be mapped to chromosomal regions and sometimes (with modern methods) to single genes or nucleotide variants (Flint and Mackay 2009). Even initially puzzling cases of very complex patterns of inheritance, such as beaded and truncate wing in Drosophila, were traced to factors linked to chromosomal genes, and the virtual universality of Mendelism was established by the early 1920s (Altenburg and Muller 1920). In contrast to the inheritance of acquired characters, mutations were found to be very rare, stable modifications of genes and to arise independently of whether or not they confer increased fitness in a given environment (Muller 1932).

By the 1920s, it was clear that (contrary to the beliefs of many early geneticists, who emphasized the large effects of dramatic mutations and ignored the evidence for the Mendelian basis of quantitative trait variation), Darwinian evolution by natural selection is, in fact, enabled by Mendelian inheritance: mutations in genes provide the source of new, stable variants on which selection can act. This set the stage for understanding that evolution is fundamentally a process of change in the frequencies of Mendelian variants within populations and species, leading to the development of classical population and quantitative genetics. The fascinating struggle to reach this understanding is ably described by Provine (1971).

The chief post-Darwin component of major importance in modern evolutionary thinking is the idea of genetic drift and, specifically, the possibility that a significant portion of variability and evolution of DNA sequence variants is driven by random fluctuations in the frequencies of variants with little or no effects on fitness (Kimura 1983). Darwin himself had the idea of selective neutrality:

Variations neither useful nor injurious would not be affected by natural selection, and would be left a fluctuating element, as perhaps we see in the species called polymorphic (Darwin 1859, p. 81).

In a surprising turn of events, the concept of selective neutrality has become a cornerstone of modern tests for natural selection, by providing a null hypothesis that can be tested against data on sequence variation and evolution. Evolutionary biology is now mature enough to repay its debt to genetics and indeed is now (together with genetic and molecular genetic approaches) central to work initiated with largely functional genetic motivations, including genome sequencing.

Given some genetic variation in a phenotype of interest, ecological genetic approaches can relate fitnesses to the differences between individuals within a single natural population, sometimes using data on undisturbed individuals (Bell 2008). With more disturbance to the organisms, between-population differences can also be tested for their selective importance by using methods such as reciprocal transplant experiments. Changes in genotype frequencies can be followed over time in such experiments or after perturbing alleles from their natural frequencies. These approaches have firmly documented the action of selection, sometimes on unexpected characters such as the inversion polymorphisms of Drosophila (Wright and Dobzhansky 1946). However, this approach may miss many instances of selection, because even the largest and most sensitive experiments, such as those involving competition between strains of yeast or bacteria, cannot detect selective differences <0.1% in magnitude (Dykhuizen 1990).

At the other extreme of the evolutionary timescale, the comparative approach can be used to relate differences in ecological conditions experienced by different evolving lineages to differences in the outcome of evolution by natural selection (Harvey and Pagel 1991). Darwin was the first biologist to explicitly use the comparative approach for this purpose. This approach is now highly statistical (Felsenstein 2004) and often uses sequence-based phylogenies, which have the advantage of being much less susceptible to the action of natural selection in causing variation in the rate and direction of character change than the morphological traits formerly used in phylogenetic analysis. Even without modern methods, Darwin used the comparative method to good effect in his work on plant mating system evolution, for example, in his review of the literature to show that inbreeding plants have smaller flowers and are generally less attractive to pollinators compared with outcrossing ones (Darwin 1876), a finding that has held up in more comprehensive modern studies and that tells us that attracting pollinators consumes resources (e.g., Ornduff 1969). The comparative approach is, however, incapable of providing estimates of the intensity of selection involved in causing the changes observed.

Modern DNA sequencing technology provides population geneticists with the ability to study the extent to which selection acts on variants across the genome, as opposed to mutation and random genetic drift. After several decades of using the ecological genetic and comparative approaches to detect selection in nature on visible or physiological traits, biologists can now test for the selective effects of specific genetic differences between individuals without needing to know their phenotypic effects. For these tests, neutrality provides an essential null hypothesis. With our newly acquired ability to apply statistical population genetics methods to the analysis of patterns of within-species variation and between-species divergence in large, genomewide data sets, extremely weak pressures of selection, well below the resolution of experimental methods, can be detected and measured. Many of the approaches currently being used are closely based on the classical work of Fisher, Kimura, and Wright on the behavior of variants subject to mutation, selection, and genetic drift, which are summarized in Kimura's (1983) book, The Neutral Theory of Molecular Evolution. These methods are often extremely computationally intensive, especially when complications like recent changes in population size are taken into account.

With the increasing availability of large data sets on DNA sequence variation across the genomes of humans and Drosophila melanogaster, we are getting close to answering questions such as: What is the distribution of selection coefficients for newly arising deleterious amino acid mutations? What fraction of amino acid variants distinguishing related species are fixed by natural selection, as opposed to genetic drift acting on neutral or slightly deleterious variants? To what extent are variants at synonymous coding sites and noncoding sites subject to selection, and how strong is this selection?

The results are sometimes quite startling. It has been fairly conclusively established, for example, that a typical human being is heterozygous for several hundred amino acid mutations, most of which have only very small effects on fitness (of the order of 10⁻³) (Boyko et al. 2008), that roughly 50% of amino acid variants distinguishing related Drosophila species have been fixed by selection (Sella et al. 2009), and that more noncoding sites than coding sites in both humans and Drosophila can mutate to selectively deleterious alternatives that are rapidly removed by selection (Encode Project Consortium 2007; Haag-Liautard et al. 2007).

In addition to these direct tests of selection on variants, we can also use information on neutral or nearly neutral variants that are not themselves under selection to make inferences about selection at linked sites in the genome. One example is the detection of selective sweeps caused by the recent spread of selectively favorable mutations. The spread of an advantageous allele can quickly lead to very low variability in the gene affected, and closely linked regions may also have reduced diversity as a result of hitchhiking through the population of the segment of the chromosome that contained the original beneficial mutation (Maynard Smith and Haigh 1974). These effects on linked neutral or nearly neutral variants can be used in statistical tests for the action of natural selection. This has enabled geneticists to detect and estimate the strength of selection acting on genes such as drug resistance genes in the human malaria parasite by using the variability of microsatellite markers (e.g., Nash et al. 2005), to detect numerous examples of recent adaptations in human populations from their effects on patterns of variation at linked SNPs (e.g., Currat et al. 2002; Sabeti et al. 2002; Williamson et al. 2007; Akey 2009), and to search for genes involved in responses to artificial selection (Walsh 2008).
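
The hitchhiking effect described by Maynard Smith and Haigh (1974) can be illustrated with a simple deterministic two-locus calculation. The sketch below is an added illustration with arbitrary parameter values, not code from the cited studies; it follows a favourable allele B that arises on a single haplotype and sweeps to high frequency, and records how much heterozygosity survives at a linked neutral locus for different recombination rates.

# Deterministic two-locus sketch of hitchhiking during a selective sweep.
# Haplotypes: x[0]=AB, x[1]=Ab, x[2]=aB, x[3]=ab, where B is the favoured allele
# and A/a are neutral variants at a linked site. Genic (haploid-style) selection
# is assumed for simplicity.

def sweep_heterozygosity(s, r, N=10**4, pA=0.5):
    q0 = 1.0 / (2 * N)                         # B starts as a single copy on an A background
    x = [q0, pA - q0, 0.0, 1.0 - pA]           # AB, Ab, aB, ab
    while x[0] + x[2] < 0.99:                  # iterate until B is nearly fixed
        # Selection: B-bearing haplotypes have relative fitness 1 + s.
        w = [1 + s, 1.0, 1 + s, 1.0]
        mean_w = sum(xi * wi for xi, wi in zip(x, w))
        x = [xi * wi / mean_w for xi, wi in zip(x, w)]
        # Recombination changes haplotype frequencies by +/- r*D,
        # where D = x_AB * x_ab - x_Ab * x_aB.
        D = x[0] * x[3] - x[1] * x[2]
        x = [x[0] - r * D, x[1] + r * D, x[2] + r * D, x[3] - r * D]
    pA_final = x[0] + x[1]
    return 2 * pA_final * (1 - pA_final)       # heterozygosity at the neutral site

before = 2 * 0.5 * 0.5
for r in (0.0, 0.0001, 0.001, 0.01):
    print(r, sweep_heterozygosity(s=0.05, r=r) / before)
# Tight linkage (small r) leaves almost no variation at the neutral site;
# looser linkage lets most of the diversity survive the sweep.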

Conversely, high variability in a region can betray the action of natural selection acting in such a way that different alleles are maintained as polymorphisms for a long period by balancing selection; this divides the population into two or more compartments, between which neutral differences can accumulate at sites that are closely linked to the targets of selection, where recombination is ineffective at preventing differentiation between the compartments (Hudson 1990; Nordborg 1997). A well-known example is the human MHC region, in which not only are there many polymorphic amino acids in exon 2, which encodes most of the peptide-binding residues of the mature human MHC proteins, but also there are extraordinarily numerous polymorphic variants at synonymous and intron sites, compared with other loci in the same populations (Raymond et al. 2005). Similarly, frequency-dependent selection has clearly maintained sequence polymorphism for long evolutionary times at plant and fungal self-incompatibility loci, whose sequences are highly diverged (e.g., Vekemans and Slatkin 1994; Richman et al. 1996; May et al. 1999).
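
As a concrete illustration of how balancing selection can hold alleles in a population indefinitely, the sketch below (added here with arbitrary selection coefficients; it is not taken from the cited studies) iterates the standard one-locus model of heterozygote advantage, in which genotypes AA, Aa, and aa have fitnesses 1 - s, 1, and 1 - t, and the allele frequency converges to the equilibrium t/(s + t) instead of being lost.

# One-locus model of heterozygote advantage (overdominance), a simple form of
# balancing selection. Genotype fitnesses: AA = 1 - s, Aa = 1, aa = 1 - t.

def next_freq(p, s, t):
    q = 1 - p
    w_AA, w_Aa, w_aa = 1 - s, 1.0, 1 - t
    mean_w = p * p * w_AA + 2 * p * q * w_Aa + q * q * w_aa
    # Frequency of A after selection, assuming random mating.
    return (p * p * w_AA + p * q * w_Aa) / mean_w

p, s, t = 0.01, 0.1, 0.2     # A starts rare; both homozygotes are disfavoured
for _ in range(1000):
    p = next_freq(p, s, t)
print(p, t / (s + t))        # converges to the predicted equilibrium t/(s + t)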

Not only can selection within single populations be studied by molecular evolutionary approaches, but between-population differences due to local adaptation can also be revealed by increased divergence at sites linked to the targets of selection (Beaumont and Balding 2004). Indeed, scans of human and other species' genomes for sequences that are more differentiated than most genes are a major way of discovering candidates for genes that are currently under selection (Akey 2009).
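
The logic of such genome scans can be sketched very simply: for each variant, allele frequencies in two populations are compared, and unusually large between-population differentiation (often summarized by FST) flags a candidate for local adaptation. The snippet below is an added, simplified illustration using a Nei-style GST for a single biallelic site with made-up allele frequencies; real scans use more careful estimators and genome-wide empirical distributions.

# Simple per-site differentiation statistic (Nei's GST, often loosely called FST)
# for a biallelic variant typed in two populations. Illustration only.

def fst_two_pops(p1, p2):
    p_bar = (p1 + p2) / 2                    # equal sample sizes assumed
    h_total = 2 * p_bar * (1 - p_bar)        # expected heterozygosity of the pooled sample
    h_within = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2
    if h_total == 0:
        return 0.0                           # monomorphic site: differentiation undefined
    return (h_total - h_within) / h_total

# A weakly differentiated site versus a strongly differentiated candidate.
print(fst_two_pops(0.50, 0.55))   # small value: consistent with drift alone
print(fst_two_pops(0.10, 0.90))   # large value: a candidate for local adaptation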

Another way in which modern evolutionary studies have contributed to genetics, as opposed to genetics contributing to evolutionary biology, is that an interest in quantifying the extent of genetic variation [initially motivated by a debate about whether variation within species is largely composed of recent mutations or includes a considerable proportion of variants maintained by balancing selection (Dobzhansky 1955; Lewontin 1974)] ultimately led to the discovery of vast numbers of DNA sequence variants that can be used as genetic markers for mapping (although these data did not in themselves settle the debate about whether selection maintains variation). The existence of abundant markers was predicted long ago:

It would accordingly be desirable, in the case of man, to make an extensive and thorough-going search for as many factors as possible that could be usedas identifiers. They should, preferably, involve character differences that are (1) of common occurrence, (2) identifiable with certainty, (3) heritable in a simple Mendelian fashion. It seems reasonable to suppose that in a species so heterozygous there must really be innumerable such factors present. It does seem clear that in the more tractable organisms, such as the domesticated and laboratory races of animals and plants, character analysis by means of linkage studies with identifying factors will come into more general use (Altenburg and Muller 1920).

In some species, naturally occurring markers can now be obtained so densely that new approaches are needed for genetic mapping because there is a very low chance of a crossover event between the closest markers (e.g., Churchill et al. 2004; Van Os et al. 2005; Flibotte et al. 2009). The possibility of obtaining large numbers of genetic markers has produced renewed progress in mapping genes affecting quantitative characters, and new approaches are being developed for such studies, including association mapping that makes use of the population genetics concept of linkage disequilibrium (associations between the allelic states of different loci or sites in a sequence; see Slatkin 2008 for an overview). The study of the population genetics of multi-locus systems once appeared to be an esoteric field, remote from empirical data, which contributed to the reputation of theoretical population genetics for dryness and irrelevance to biology. Nevertheless, very important principles were developed that are now widely used by other geneticists, including ways to measure linkage disequilibrium [also now used to estimate recombination rates in genomes by using samples of sequences from populations (Myers et al. 2005)] and the concept that selection acting on a given sequence variant or allele has effects on closely linked variants (see above).
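
Two of the most widely used measures of linkage disequilibrium, D' and r², can be computed directly from haplotype frequencies, as in the following added sketch; the haplotype counts are invented purely for illustration.

# Linkage disequilibrium between two biallelic sites from phased haplotype counts.
# D = freq(AB) - freq(A)*freq(B); D' rescales D by its maximum possible value
# (and carries the sign of D), and r**2 is the squared correlation between the
# allelic states at the two sites.

def ld_measures(n_AB, n_Ab, n_aB, n_ab):
    n = n_AB + n_Ab + n_aB + n_ab
    pAB = n_AB / n
    pA = (n_AB + n_Ab) / n
    pB = (n_AB + n_aB) / n
    D = pAB - pA * pB
    if D >= 0:
        d_max = min(pA * (1 - pB), (1 - pA) * pB)
    else:
        d_max = min(pA * pB, (1 - pA) * (1 - pB))
    d_prime = D / d_max if d_max > 0 else 0.0
    r2 = D * D / (pA * (1 - pA) * pB * (1 - pB))
    return D, d_prime, r2

# Invented counts for 100 haplotypes showing a strong association between A and B.
print(ld_measures(n_AB=45, n_Ab=5, n_aB=5, n_ab=45))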

The kinds of approaches just mentioned are no longer restricted to humans and the genetics model organisms of most interest for functional molecular genetic work. One well-established use of markers is to infer the mating systems of populations in the wild (Ritland 1990). Darwin anticipated this when he used phenotypic differences, including flower colors, that he clearly assumed to be inherited, to infer the parentage of seeds:

Altogether 233 plants were raised, of which 155 were mongrelised in the plainest manner, and of the remaining 78 not half were absolutely pure. I repeated the experiment by planting near together two varieties of cabbage with purple-green and white-green lacinated leaves; and of the 325 seedlings raised from the purple-green variety, 165 had white-green and 160 purple-green leaves. Of the 466 seedlings raised from the white-green variety, 220 had purple-green and 246 white-green leaves. These cases show how largely pollen from a neighbouring variety of the cabbage effaces the action of the plant's own pollen (Darwin 1876, p. 393).
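
Darwin's cabbage counts can be read as a crude estimate of the outcrossing rate, in the spirit of modern marker-based methods: the fraction of seedlings showing the other variety's leaf type gives a minimum estimate of how many seeds were cross-pollinated (a minimum because, depending on dominance, some crossed seedlings may resemble the maternal variety). The short calculation below is an added illustration using the numbers in the passage just quoted.

# Minimum outcrossing-rate estimates from Darwin's cabbage seedling counts,
# taking seedlings with the other variety's leaf colour as evidence of crossing.
crosses = {
    "purple-green mother": (165, 325),   # 165 white-green seedlings out of 325
    "white-green mother":  (220, 466),   # 220 purple-green seedlings out of 466
}
for mother, (off_type, total) in crosses.items():
    print(mother, round(off_type / total, 2))
# Both estimates are close to one half: at least about 50% of the seeds were
# fathered by pollen from the neighbouring variety.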

It is now becoming possible to conduct fine-scale genetic mapping studies in nonmodel species, including those of applied interest, such as domesticated animals and plants and their pathogens, where QTL mapping is being aided by the abundant supply of new markers. Genetic mapping gives promise of testing hypotheses such as the close linkage of genes involved in heterostyly in Primula and other plant species (Li et al. 2007; Labonne et al. 2009) and mimicry in butterflies (Baxter et al. 2008), examples of problems that interested Darwin. Gene mapping is also important in modern work on the genetics of speciation, which is at last identifying genes involved in reproductive isolation between closely related species and is uncovering evidence for the Dobzhansky–Muller hypothesis that natural selection is important in causing genetic differences between populations that lower the survival or fertility of F1 or F2 hybrids, as a result of deleterious epistatic interactions between alleles derived from the two populations (e.g., Barbash et al. 2003; Presgraves et al. 2003). As is well known, Darwin himself found the evolution of reproductive isolation puzzling:

The importance of the fact that hybrids are very generally sterile, has, I think, been much underrated by some late writers. On the theory of natural selection the case is especially important as the sterility of hybrids could not possibly be of any advantage to them, and therefore could not have been acquired by the continued preservation of successive profitable degrees of sterility (Darwin 1859, p. 245).

However, the title The Origin of Species did not refer to this central puzzle concerning speciation, but rather to the evolution of adaptations and character differences; before the rise of genetics, it would have been virtually impossible for a correct interpretation of reproductive isolation to have been developed.

Another long-debated topic for which genetic marker availability should help our understanding is the question of the genetic basis of inbreeding depression and of heterosis. Although the deleterious effects of inbreeding were known to some earlier biologists, Darwin was the first to study the phenomenon thoroughly, because he realized that it provides an explanation for the existence of the elaborate adaptations of plants to avoid inbreeding. Darwin's book The Effects of Cross and Self Fertilization in the Vegetable Kingdom described his own experiments comparing progeny produced by self- and cross-fertilization in 57 plant species, and his summary of the main results anticipated future work that allowed us to measure inbreeding (in modern terms, inbreeding coefficients):

That certain plants, for instance, Viola tricolor, Digitalis purpurea, Sarothamnus scoparius, Cyclamen persicum, etc., which have been naturally cross-fertilised for many or all previous generations, should suffer to an extreme degree from a single act of self-fertilisation is a most surprising fact. Nothing of the kind has been observed in our domestic animals; but then we must remember that the closest possible interbreeding with such animals, that is, between brothers and sisters, cannot be considered as nearly so close a union as that between the pollen and ovules of the same flower. Whether the evil from self-fertilisation goes on increasing during successive generations is not as yet known; but we may infer from my experiments that the increase if any is far from rapid. After plants have been propagated by self-fertilisation for several generations, a single cross with a fresh stock restores their pristine vigour; and we have a strictly analogous result with our domestic animals (Darwin 1876, p. 438).
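
The "future work" alluded to above can be made concrete: under repeated self-fertilisation the inbreeding coefficient F follows the simple recursion F(t+1) = (1 + F(t))/2, so the expected heterozygosity is halved each generation. The sketch below is an added illustration of that recursion, not an analysis of Darwin's data.

# Inbreeding coefficient F under successive generations of self-fertilisation,
# starting from a fully outbred individual (F = 0). Expected heterozygosity
# declines in proportion to (1 - F), i.e. it is halved every generation of selfing.
F = 0.0
for generation in range(1, 11):
    F = (1 + F) / 2
    print(generation, round(F, 4), round(1 - F, 4))  # F and remaining heterozygosity
# After one generation F = 0.5; by generation 10, F exceeds 0.999, so most of the
# loss of heterozygosity happens early, consistent with Darwin's observation that
# any increase in the "evil" of continued selfing "is far from rapid".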

As pointed out by Fisher in his Design of Experiments (Fisher 1935, chap. 3), Darwin used paired contrasts of the performance of an inbred and an outbred plant grown in the same pot, a method that is widely used in modern biological statistics. Darwin's insight into the utility of this approach was spoiled by the reanalysis of his data conducted by his cousin, Francis Galton, a supposedly more expert statistician.
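
The paired design that Fisher highlighted is analysed today as a paired comparison of within-pot differences, with a paired t statistic as the modern formalisation. The following sketch uses invented plant-height numbers purely to show the calculation; it does not reproduce Darwin's measurements.

# Paired comparison of crossed vs. selfed plants grown in the same pot.
# The heights below are invented illustrative numbers, not Darwin's data.
import math
import statistics

crossed = [23.0, 21.5, 22.0, 20.5, 19.0, 21.0, 22.5, 20.0]
selfed  = [17.5, 20.5, 18.0, 20.0, 18.5, 18.0, 19.0, 15.5]

diffs = [c - s for c, s in zip(crossed, selfed)]
n = len(diffs)
mean_d = statistics.mean(diffs)
se_d = statistics.stdev(diffs) / math.sqrt(n)   # standard error of the mean within-pot difference
t = mean_d / se_d                                # compare with a t distribution on n - 1 df
print(round(mean_d, 2), round(t, 2))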

Just as with his theory of sexual selection to explain male/female dimorphism (Darwin 1871), which was largely neglected until the 1970s, the individual selective advantage to outcrossing arising from inbreeding depression postulated by Darwin was rejected by leading 20th century thinkers on plant evolution, such as C. D. Darlington and G. L. Stebbins, in favor of group selection hypotheses of advantages of increased variability to the population or species. The role of inbreeding depression in the evolution of mating systems is, however, now well established (Barrett 2002).

Although Darwin was unable to provide a satisfactory interpretation of his observations, inbreeding depression is now well known to be a genetic phenomenon, and hybrid vigor (heterosis) is widely used in agriculture. It is also well known that the genetic basis of these phenomena is difficult to ascertain and that this may impede efforts to make the best use of heterosis. There is no doubt that rare, deleterious mutations play an important role (Charlesworth and Charlesworth 1999): inbreeding, by producing homozygotes for such mutations, reduces survival and fertility because a large proportion of deleterious mutations are recessive, or partially so, and cause only slight harm when heterozygous, as was first clearly proposed by D. F. Jones (Jones 1917). Heterosis is also explicable on this basis because different inbred strains will be homozygous for different deleterious mutations, and different populations of a species in nature will differ similarly at some proportion of their genes, particularly if the populations are highly isolated (Ingvarsson et al. 2000; Escobar et al. 2008). It is still much less clear whether loci with overdominant alleles (alleles showing heterozygote advantage) also contribute any major part of inbreeding depression or heterosis, although it is intuitively easy to understand that, if such loci are common, these effects would be produced. Identification of the genetic factors involved in inbreeding depression or heterosis by the fine-scale mapping methods referred to above should help to answer these questions.
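
The dominance explanation of heterosis can be illustrated with a toy multiplicative-fitness calculation: two inbred lines are each homozygous for different mildly deleterious, nearly recessive mutations, and the F1 hybrid, being heterozygous at all of those loci, escapes nearly all of the fitness cost. The sketch below is an added illustration with arbitrary parameter values, not a model fitted to data.

# Toy illustration of heterosis under the dominance hypothesis.
# Each homozygous deleterious mutation multiplies fitness by (1 - s);
# when heterozygous it costs only h*s (h is the dominance coefficient).
s, h = 0.02, 0.05          # mildly deleterious, nearly recessive mutations
n_line1, n_line2 = 40, 40  # loci fixed for different mutations in the two inbred lines

fitness_line1 = (1 - s) ** n_line1                     # homozygous at its own 40 loci
fitness_line2 = (1 - s) ** n_line2
fitness_hybrid = (1 - h * s) ** (n_line1 + n_line2)    # heterozygous at all 80 loci

print(round(fitness_line1, 3), round(fitness_hybrid, 3))
# The hybrid's fitness exceeds either parent's because the (different) recessive
# mutations carried by the two lines are masked in the heterozygous state.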

The examples that we have outlined here show the value of the ongoing interaction between genetics and the study of evolution. From being a major headache for early supporters of evolution, genetics paved the way for models of evolution based on the known properties of inheritance, so that the constraints experienced by genes and genomes in evolution were correctly incorporated into quantitative models, and new possibilities, unknown to Darwin, were discovered.

Evolutionary genetics is inherently interdisciplinary, fruitfully combining models (often mathematical and often stochastic, given the nature of genetics) with empirical data. This intellectual tradition, now 100 years old, deserves celebration along with Darwin's anniversaries. We hope that we have shown that evolution is more central to modern biological research than ever before and that this productive collaboration with genetics can be predicted to yield many further pure and applied scientific riches in the next hundred years. For this to happen, the need for a broad-enough education must be met. Biologists and doctors will need to understand genetics, and even some population genetics concepts, at least enough to collaborate with people with expertise in relevant quantitative methods. Mathematical ideas need to be demystified, as far as possible, so that biologists using phylogenetic and genetic marker or diversity analyses know what lies behind the computer programs that they use, an understanding without which the numbers that come out may lead to wrong conclusions. We need to regain a respect for the usefulness of statistics throughout biology and use it to test our ideas, as Darwin started to do. The same applies to theoretical modeling directed toward testable hypotheses, of which the idea of natural selection is still an excellent example, even though it has been extended to a far wider realm of biology than Darwin initially proposed and has given us many valuable tools at the interface between genetics and evolution. Darwin himself was interested in the functioning of organisms, not just in their morphology and relationships and the history of life, and he would surely have been delighted to see where his ideas have so far led us and how they have continued to be central within biology. In Dobzhansky's famous words:

Nothing in biology makes sense except in the light of evolution (Dobzhansky 1973).

We thank Adam Wilkins and two reviewers for their helpful suggestions for improving the manuscript.

Akey, J. M., 2009 Constructing genomic maps of positive selection in humans: Where do we go from here? Genome Res. 19: 711–722.

Altenburg, E., and H. J. Muller, 1920 The genetic basis of truncate wing: an inconstant and modifiable character in Drosophila. Genetics 5: 1–59.

Barbash, D. A., D. F. Siino, A. M. Tarone and J. Roote, 2003 A rapidly evolving MYB-related protein causes species isolation in Drosophila. Proc. Natl. Acad. Sci. USA 6: 5302–5307.

Barrett, P. H., P. J. Gautrey, S. Herbert, D. Kohn and S. Smith, 1987 Charles Darwin's Notebooks, 1836–1844. Cornell University Press, Ithaca, NY.

Barrett, S. C. H., 2002 The evolution of plant sexual diversity. Nat. Rev. Genet. 3: 274–284.

Baxter, S., R. Papa, N. Chamberlain, S. Humphray, M. Joron et al., 2008 Convergent evolution in the genetic basis of Muellerian mimicry in Heliconius butterflies. Genetics 180: 1567–1577.

Beaumont, M. A., and D. J. Balding, 2004 Identifying adaptive genetic divergence among populations from genome scans. Mol. Ecol. 13: 969–980.

Bell, G., 2008 Selection, Ed. 2. Oxford University Press, Oxford.

Bowler, P. J., 1983 Eclipse of Darwinism: Anti-Darwinian Evolution Theories in the Decade Around 1900. Johns Hopkins University Press, Baltimore.

Boyko, A., S. H. Williamson, A. R. Indap, J. D. Degenhardt, R. D. Hernandez, et al., 2008 Assessing the evolutionary impact of amino-acid mutations in the human genome. PLOS Genet. 5: e1000083.

Bulmer, M. G., 2003 Francis Galton. Johns Hopkins University Press, Baltimore.

Bumpus, H. C., 1899 The elimination of the unfit as illustrated by the introduced sparrow, Passer domesticus. Biol. Lectures Woods Hole Marine Biol. Station 6: 209–226.

Charlesworth, B., and D. Charlesworth, 1999 The genetic basis of inbreeding depression. Genet. Res. 74: 329–340.

Churchill, G. A., D. C. Airey, H. Allayee, J. M. Angel, A. D. Attie et al., 2004 The Collaborative Cross, a community resource for the genetic analysis of complex traits. Nat. Genet. 36: 1133–1137.

Cubas, P., C. Vincent and E. S. Coen, 1999 An epigenetic mutation responsible for natural variation in floral asymmetry. Nature 401: 157–161.

Currat, M., G. Trabuchet, D. Rees, P. Perrin, R. M. Harding et al., 2002 Molecular analysis of the beta-globin gene cluster in the Niokholo Mandenka population reveals a recent origin of the beta(S) Senegal mutation. Am. J. Hum. Genet. 70: 207–223.

Darwin, C. R., 1859 The Origin of Species. John Murray, London.

Darwin, C. R., 1868 Variation of Animals and Plants Under Domestication. John Murray, London.

Darwin, C. R., 1871 The Descent of Man and Selection in Relation to Sex. John Murray, London.

Darwin, C. R., 1876 The Effects of Cross and Self Fertilization in the Vegetable Kingdom. John Murray, London.

Darwin, C. R., 1877 The Different Forms of Flowers on Plants of the Same Species. John Murray, London.

Di Cesnola, A. P., 1907 A first study of natural selection in Helix arbustorum (Helicogena). Biometrika 5: 387–399.

Dobzhansky, T., 1955 A review of some fundamental concepts and problems of population genetics. Cold Spring Harbor Symp. Quant. Biol. 20: 1–15.

Dobzhansky, T., 1973 Nothing in biology makes sense except in the light of evolution. Am. Biol. Teach. 35: 125–129.

Dykhuizen, D. E., 1990 Experimental studies of natural selection in bacteria. Annu. Rev. Ecol. Evol. Syst. 21: 373–398.

East, E. M., 1910 A Mendelian interpretation of variation that is apparently continuous. Am. Nat. 44: 65–82.

Encode Project Consortium, 2007 Identification and analysis of functional elements in 1% of the human genome by the ENCODE pilot project. Nature 447: 799–816.

Escobar, J., A. Nicot and P. David, 2008 The different sources of variation in inbreeding depression, heterosis and outbreeding depression in a metapopulation of Physa acuta. Genetics 180: 1593–1608.

Farley, J., 1982 Gametes and Spores. Johns Hopkins Press, Baltimore.

Felsenstein, J., 2004 Inferring Phylogenies. Sinauer, Sunderland, MA.

Fisher, R. A., 1930 The Genetical Theory of Natural Selection. Clarendon Press, Oxford.

Fisher, R. A., 1935 The Design of Experiments. Oliver & Boyd, Edinburgh.

Flibotte, S., M. Edgley, J. Maydan, J. Taylor, R. Zapf et al., 2009 Rapid high resolution single nucleotide polymorphism-comparative genome hybridization mapping in Caenorhabditis elegans. Genetics 181: 33–37.

Flint, J., and T. F. C. Mackay, 2009 Genetic architecture of quantitative traits in mice, flies, and humans. Genome Res. 19: 723–733.

Gayon, J., 1998 Darwinism's Struggle for Survival: Heredity and the Hypothesis of Natural Selection. Cambridge University Press, Cambridge, UK.

Gillispie, C. C., 1960 The Edge of Objectivity. Princeton University Press, Princeton, NJ.

Haag-Liautard, C., M. Dorris, X. Maside, S. Macaskill, D. L. Halligan et al., 2007 Direct estimation of per nucleotide and genomic deleterious mutation rates in Drosophila. Nature 445: 82–85.

Harvey, P. H., and M. Pagel, 1991 The Comparative Method in Evolutionary Biology. Oxford University Press, Oxford.

Heijmans, B. T., E. W. Tobi, A. D. Stein, H. Putter, G. J. Blauw et al., 2008 Persistent epigenetic differences associated with prenatal exposure to famine in humans. Proc. Natl. Acad. Sci. USA 6: 17046–17049.

Link:
Darwin and Genetics | Genetics

Read More...

On National DNA Day, scientists are trying to take the colonialism out of genetics – Massive Science

Friday, April 24th, 2020

Scientists are trying to tackle the lack of diversity seen in genomics research, but even ambitious efforts, like the NIH's All of Us program, often fall short, especially when it comes to the inclusion of Indigenous communities. This is one of the reasons why the Decolonize DNA Day conference is taking place on April 24th, one day before the National DNA Day.

Traditionally, National DNA Day is an annual celebration of the discovery of DNA's double helix structure (1953) and the completion of the Human Genome Project (2003).

"I was having conversations with colleagues on what would it mean to decolonize DNA," says Krystal Tsosie, an Indigenous (Diné/Navajo) PhD student at Vanderbilt University. "As an Indigenous academic, we always talk about what it means to Indigenize and re-Indigenize different disciplines of academia that have been historically more white-centred or white-dominated... and what it would mean to remove the colonial lens."

In collaboration with Latrice Landry and Jerome de Groot, Tsosie co-organized the Decolonize DNA Day Twitter conference to help re-frame narratives around DNA. Each speaker will have an hour to tweet out their "talk" and lead conversations on various topics, including how DNA ancestry testing fuels anti-Indigeneity and how to utilize emerging technologies to decolonize precision medicine.

"There is a divide between people who are doing the science or the academic work, and the people who we want to inform," says Tsosie. "Twitter is a great way to bridge that divide."

The Decolonize DNA Day conference is simply one effort to Indigenize genomics. Tsosie is also a co-founder of the Native BioData Consortium, a non-profit organization consisting of researchers and Indigenous members of tribal communities, focused on increasing the understanding of Native American genomic issues.

"We don't really see a heavy amount of Indigenous engagement in genetic studies, which then means that as precision medicine advances as a whole [...] those innovations are not going to be applied to Indigenous people," says Tsosie. "How do we get more Indigenous people engaged?"

Some of the answers can be found in a recent Nature Reviews Genetics perspective, penned by Indigenous scientists and communities, including those from the Native BioData Consortium. The piece highlights the actions that genomics researchers can take to address issues of trust, accountability, and equity. Recommended actions include the need for early consultations, developing benefit-sharing agreements, and appropriately crediting community support in any academic publications.

"By switching power dynamics, we're hoping to get genomic researchers to work with us, instead of against us," says Tsosie.

Read more:
On National DNA Day, scientists are trying to take the colonialism out of genetics - Massive Science

Read More...

Genetic variants linked with onset, progression of POAG – Modern Retina

Friday, April 24th, 2020

Genetic variants that are unrelated to the IOP are associated with a family history of glaucoma and play a role in the onset of primary open-angle glaucoma (POAG). Genetic variants that are related to the IOP are associated with the age at which glaucoma is diagnosed and are associated with disease progression.

What is known about POAG, the most prevalent form of glaucoma, is that increased IOP and myopia are risk factors for damage to the optic nerve in POAG.

Related: Stent offers IOP stability more than three years after surgery

A family history of glaucoma is a major risk factor for development of POAG; in light of this, genetic factors are thought to be important in the disease pathogenesis, and a few gene mutations have been identified as causing POAG, according to Fumihiko Mabuchi, MD, PhD, professor, Department of Ophthalmology, Faculty of Medicine, University of Yamanashi, Kofu, Japan.

Myopia has been shown to be a risk factor for POAG in several studies. However, it can be difficult to diagnose true POAG in myopic patients, and controversy exists over whether it is a real risk factor.

Myopic optic discs are notoriously difficult to assess, and myopic patients may have visual field defects unrelated to any glaucomatous process.

The prevalence of POAG increases with age, even after compensating for the association between age and IOP.

Related: Preservative-free tafluprost/timolol lowers IOP well, glaucoma study shows

Part of the story
Dr. Mabuchi and his colleagues recounted that these factors are only part of the story.

According to Dr. Mabuchi and his colleagues, cases of POAG caused by these gene mutations account for several percent of all POAG cases, and most POAG is presumed to be a polygenic disease.

Recent genetic analyses, the investigators explained, have reported genetic variants that predispose patients to development of POAG and the additive effect of these variants on POAG; the variants are classified into two types.

The first type of genetic variants is associated with IOP elevation.

Related: Sustained-release implant offers long-term IOP control, preserved visual function

View post:
Genetic variants linked with onset, progression of POAG - Modern Retina

Read More...

Meghan Trainor shows her work in behind the scenes video of "Genetics" performance with Nicole Scherzinger of – LaineyGossip

Friday, April 24th, 2020

Like everyone else, I'm consuming everything in sight these days, because what else are we going to do, really; but also like everyone else (I suspect) I'm using the time either to catch up on series I heard I should have watched, or to be pulled into whatever Netflix serves me up.

Then the other day, @ceas89 sent us a video on Twitter, and I was enthralled and also ashamed. Enthralled because it was amazing; ashamed because before now, I didn't know. The video is a behind the scenes of Trainor's song "Genetics", featuring Nicole Scherzinger of the Pussycat Dolls, and before I clicked, I read the tweet, which said, "can you please talk about the work happening in this video?" So, I played it (skip to 0:30 to really get to the goods):

The song was released in September 2019, but Meghan posted it just a couple of weeks ago. And, as @ceas89 predicted, I watched it over and over, because it's incredible. And I'll confess, I didn't know this work was gonna be here. I love that they're recording in an apartment or bungalow of some kind; we've seen this kind of video before, but often in a blank, black studio. Somehow I felt more welcomed into the room in this video, which is a cool trick.

But it's the dynamics of this that I love most.

Meghan starts off complimenting Nicole, in a way that might seem self-deprecating. "I have the voice to record, I just don't have the dance moves to dance." Initially I was sort of frustrated with that, because I felt like the "Meghan Trainor doesn't have the right body for the music industry" narrative is boring, played out, and also, it's patently not true. Ask Lizzo.

But as the rest of the video unfolds, it's clear that what she's doing is actually trying to build Nicole up. Because it's Meghan's song, and she knows how it needs to go, while Nicole is anxiously trying to do her best, but is aware, maybe, that she's not *quite* nailing it. We don't see the takes where she doesn't get her vocals to where they need to be, but her affect and the way she looks at Meghan for reassurance makes it clear.

Trainor builds her up, over and over again, Oh-my-God-ing over Nicole's vocals, but also clearly directing her: what kind of tone, what kind of dynamic. Even from her slumped position on the couch she's commanding, as she tells Nicole "sing with me", until it's clear that Nicole has gotten it right. "Wait," she directs Nicole during a vocal run, and then, never missing a beat, reminds her, "It's a quick 'but' here."

Then look what happens at 2:41. Meghan lays down some vocals, and Nicole, stunned, comments, "Wow, she just does it in one take" and then puts her head in her hands. As @ceas89 put it, Nicole is legitimately shook, and it's true. "It's so easy for you," she comments to Meghan, who laughs. No self-deprecation this time...

Because it's clearly true. It's easy for Meghan. It's easy for her and necessary for her to tell Nicole how the song needs to be done, because she wrote it. Did you know Meghan Trainor wrote songs? I mean, I guess I assumed she wrote her own songs, but I didn't know she'd also written songs for Jason Derulo, Jason Mraz, Faith Hill and Tim McGraw, Michael Bublé, and Jennifer Lopez.

I didn't know. Not really. And I should have.

I loved "All About That Bass" like everyone else, but as much as I appreciated the retro-bop style, I didn't love the messaging about what boys need, and the follow-ups of "Dear Future Husband" and "Lips Are Moving" seemed so retro that I lost interest. Not that Trainor cares; her career has been incredibly successful whether or not I like her messaging or packaging.

But let's be real: I underestimated Meghan Trainor, maybe because of how she was marketed. I saw her as a gimmicky artist (and look, her image does have inherent gimmick-ness to it) but I let it cloud me to the phenomenal talent that's gotten her this far, and I'm mad at myself for doing it. Why do talented people have to have a certain image? If she was in plaid button-downs and less lipstick in her videos, would I have seen it more? If there were more videos like this out there alongside her cutesy finger-wagging videos?

I'm not saying we have to like everyone, and there are millions of talented people, and entire genres of entertainment that are just not for me, the same way some people can't abide musicals, or animation, or non-fiction. But this video was a reminder that even though I think I see through all kinds of showbiz packaging, the stuff I really value (hard work, genius-level talent, industry respect) is totally separate from the packaging and marketing of an artist, and I let this one slip by me before now. Thank you, @ceas89, for the education.

Whose talent do you think everyone else is sleeping on? (No, Lainey, BTS does not apply here.) Is there someone you've discovered recently, since consuming entertainment has become our collective new job? Do you have a method for exposing yourself to new stuff so you don't make snap judgments like I did? There's nothing I love more than a celebrity surprise, so if there's someone I, or all of us, should know, hit us up. A new discovery is a delightful comfort and joy right now.

See more here:
Meghan Trainor shows her work in behind the scenes video of "Genetics" performance with Nicole Scherzinger of - LaineyGossip

Read More...

Suspect In 1981 Murder Of 17-Year-Old Sacramento Girl Identified Thanks To Genetic Genealogy – CBS Sacramento

Friday, April 24th, 2020

SACRAMENTO (CBS13) A brutal Sacramento murder case that went cold for nearly four decades has been solved thanks to genetic genealogy, detectives say.

Mary London was a 17-year-old sophomore at Sacramento High School. On the morning of Jan. 15, 1981, her body was found dumped on the side of what once was a rural stretch of San Juan Road; she had been stabbed multiple times, police said.

The case went cold and no suspect was ever identified.

That is until Thursday, when the Sacramento Police Department and Sacramento County District Attorneys Office announced that they had identified a suspect in the case.

Detectives say genetic genealogy and traditional DNA evidence have linked a man named Vernon Parker to the crime.

"Investigative genetic genealogy has revolutionized law enforcement's ability to solve violent crime: to identify the guilty and exonerate the innocent," said District Attorney Anne Marie Schubert in a statement about the case.

No other information, including what may have led up to the killing, was released. Parker was murdered a little over a year after Mary's death, detectives say.

Genetic genealogy has helped identify a number of suspects in cases that had gone cold. The technique came to prominence in 2018 when it was credited with helping identify Joseph DeAngelo as the suspect in the Golden State Killer/East Area Rapist case.

Go here to see the original:
Suspect In 1981 Murder Of 17-Year-Old Sacramento Girl Identified Thanks To Genetic Genealogy - CBS Sacramento

Read More...
