The evolving definition of the term "gene"

Friday, June 30, 2017

The Evolving Definition of the Term “Gene”

Petter Portin and Adam Wilkins

GENETICS April 1, 2017 vol. 205 no. 4 1353-1364; https://doi.org/10.1534/genetics.116.196956


Abstract

This paper presents a history of the changing meanings of the term “gene,” over more than a century, and a discussion of why this word, so crucial to genetics, needs redefinition today. In this account, the first two phases of 20th century genetics are designated the “classical” and the “neoclassical” periods, and the current molecular-genetic era the “modern period.” While the first two stages generated increasing clarity about the nature of the gene, the present period features complexity and confusion. Initially, the term “gene” was coined to denote an abstract “unit of inheritance,” to which no specific material attributes were assigned. As the classical and neoclassical periods unfolded, the term became more concrete, first as a dimensionless point on a chromosome, then as a linear segment within a chromosome, and finally as a linear segment in the DNA molecule that encodes a polypeptide chain. This last definition, from the early 1960s, remains the one employed today, but developments since the 
1970s have undermined its generality. Indeed, they raise questions about both the utility of the concept of a basic “unit of inheritance” and the long implicit belief that genes are autonomous agents. Here, we review findings that have made the classic molecular definition obsolete and propose a new one based on contemporary knowledge.

GENE STRUCTURE FUNCTION GENE NETWORKS REGULATION THEORY

Copyright © 2017 by the Genetics Society of America

+++++

Subscription or payment needed/Requer assinatura ou pagamento: Genetics

Professors, researchers, and students at public and private universities with access to the Portal de Periódicos CAPES/MEC site can read this Genetics article, and those of more than 33,000 other scientific publications, free of charge.

Design hypotheses behave like skeptical hypotheses (or: why we cannot know the falsity of design hypotheses)

Design Hypotheses Behave Like Skeptical Hypotheses (or: Why We Can’t Know the Falsity of Design Hypotheses)

Authors: René van Woudenberg1 and Jeroen de Ridder1

Source: International Journal for the Study of Skepticism, Volume 7, Issue 2, pages 69–90. Publication Year: 2017

DOI: 10.1163/22105700-20171192
ISSN: 2210-5697 E-ISSN: 2210-5700

Document Type: Research Article

Subjects: Philosophy

Keywords: skeptical theism; radical skepticism; proper function; teleology; design

Source/Fonte: The Logical Place

Abstract

It is often claimed that, as a result of scientific progress, we now know that the natural world displays no design. Although we have no interest in defending design hypotheses, we will argue that establishing claims to the effect that we know the denials of design hypotheses is more difficult than it seems. We do so by issuing two skeptical challenges to design-deniers. The first challenge draws inspiration from radical skepticism and shows how design claims are at least as compelling as radical skeptical scenarios in undermining knowledge claims, and in fact probably more so. The second challenge takes its cue from skeptical theism and shows how we are typically not in an epistemic position to rule out design.

Affiliations: 1: Vrije Universiteit Amsterdam, r.van.woudenberg@vu.nl; g.j.de.ridder@vu.nl

+++++

Subscription or payment needed/Requer assinatura ou pagamento:

International Journal for the Study of Skepticism 

+++++

NOTE FROM THIS BLOGGER:

In our debates with design deniers, especially evolutionary biologists, we seek, in a civil manner, to demonstrate the absurdity of denying the design found in nature, especially in irreducibly complex biological systems and in the digitized complex specified information of DNA. Here, two authors who are not proponents of intelligent design do exactly that, with epistemic mastery.

Two liquid forms of water

Thursday, June 29, 2017

Diffusive dynamics during the high-to-low density transition in amorphous ice

Fivos Perakis a,b,1, Katrin Amann-Winkel a,1, Felix Lehmkühler c,d, Michael Sprung c, Daniel Mariedahl a, Jonas A. Sellberg e, Harshad Pathak a, Alexander Späh a, Filippo Cavalca a,b, Daniel Schlesinger a,2, Alessandro Ricci c, Avni Jain c, Bernhard Massani f, Flora Aubree f, Chris J. Benmore g, Thomas Loerting f, Gerhard Grübel c,d, Lars G. M. Pettersson a, and Anders Nilsson a,

Author Affiliations

aDepartment of Physics, AlbaNova University Center, Stockholm University, S-10691 Stockholm, Sweden;

bSLAC National Accelerator Laboratory, Menlo Park, CA 94025;

cDeutsches Elektronen-Synchrotron (DESY), 22607 Hamburg, Germany;

dHamburg Centre for Ultrafast Imaging, 22761 Hamburg, Germany;

eBiomedical and X-ray Physics, Department of Applied Physics, AlbaNova University Center, KTH Royal Institute of Technology, S-10691 Stockholm, Sweden;

fInstitute of Physical Chemistry, University of Innsbruck, A-6020 Innsbruck, Austria;

gX-ray Science Division, Advanced Photon Source, Argonne National Laboratory, Argonne, IL 60439

Edited by Pablo G. Debenedetti, Princeton University, Princeton, NJ, and approved May 31, 2017 (received for review March 31, 2017)



Liquid water exists in two different forms, new research reveals. Here, an illustration of the water molecule in front of an X-ray pattern from high-density amorphous ice, which is produced at high pressures and low temperatures.

Significance

The importance of a molecular-level understanding of the properties, structure, and dynamics of liquid water is recognized in many scientific fields. It has been debated whether the observed high- and low-density amorphous ice forms are related to two distinct liquid forms. Here, we study experimentally the structure and dynamics of high-density amorphous ice as it relaxes into the low-density form. The unique aspect of this work is the combination of two X-ray methods: wide-angle X-ray scattering provides evidence for the structure at the atomic level, while X-ray photon-correlation spectroscopy provides insight into the motion at the nanoscale. The observed motion appears diffusive, indicating liquid-like dynamics during the relaxation from the high- to the low-density form.

Abstract

Water exists in high- and low-density amorphous ice forms (HDA and LDA), which could correspond to the glassy states of high- (HDL) and low-density liquid (LDL) in the metastable part of the phase diagram. However, the nature of both the glass transition and the high-to-low-density transition is debated, and new experimental evidence is needed. Here we combine wide-angle X-ray scattering (WAXS) with X-ray photon-correlation spectroscopy (XPCS) in the small-angle X-ray scattering (SAXS) geometry to probe both the structural and dynamical properties during the high-to-low-density transition in amorphous ice at 1 bar. By analyzing the structure factor and the radial distribution function, the coexistence of two structurally distinct domains is observed at T = 125 K. XPCS probes the dynamics in momentum space, which in the SAXS geometry reflects structural relaxation on the nanometer length scale. The dynamics of HDA are characterized by a slow component with a large time constant, arising from viscoelastic relaxation and stress release from nanometer-sized heterogeneities. Above 110 K a faster, strongly temperature-dependent component appears, with momentum transfer dependence pointing toward nanoscale diffusion. This dynamical component slows down after the transition into the low-density form at 130 K, but remains diffusive. The diffusive character of both the high- and low-density forms is discussed in light of different interpretations, and the results are most consistent with the hypothesis of a liquid–liquid transition in the ultraviscous regime.
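In XPCS the measured quantity is the intensity autocorrelation function g2(q, t), and relaxation times are commonly extracted by fitting it with a Kohlrausch (stretched/compressed exponential) form, g2(q, t) = 1 + β·exp[-2(t/τ)^γ]; for diffusive dynamics the rate 1/τ grows roughly as q². The sketch below only illustrates that standard fitting step, not the authors' analysis code, and the contrast, relaxation time, and exponent used to generate the synthetic data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def g2_model(t, beta, tau, gamma):
    """Kohlrausch form commonly used for XPCS intensity correlation functions."""
    return 1.0 + beta * np.exp(-2.0 * (t / tau) ** gamma)

# Synthetic correlation data (hypothetical contrast, relaxation time, and exponent)
t = np.logspace(-1, 3, 60)                       # delay times in seconds
g2_true = g2_model(t, beta=0.20, tau=50.0, gamma=1.0)
g2_noisy = g2_true + np.random.normal(0, 0.002, t.size)

# Fit and report the extracted relaxation time
popt, _ = curve_fit(g2_model, t, g2_noisy, p0=[0.1, 10.0, 1.0])
beta_fit, tau_fit, gamma_fit = popt
print(f"beta={beta_fit:.3f}, tau={tau_fit:.1f} s, gamma={gamma_fit:.2f}")

# For diffusive dynamics, 1/tau is expected to grow as q**2 across momentum transfers.
```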

Keywords: liquid–liquid transition, glass transition, amorphous ice, X-ray photon-correlation spectroscopy, supercooled water

Footnotes

1F.P. and K.A.-W. contributed equally to this work.

2Present address: Department of Environmental Science and Analytical Chemistry & Bolin Centre for Climate Research, Stockholm University, 114 18 Stockholm, Sweden.

3To whom correspondence should be addressed. Email: andersn@fysik.su.se.

Author contributions: F.P., K.A.-W., F.L., M.S., A.R., T.L., G.G., and A.N. designed research; F.P., K.A.-W., F.L., M.S., D.M., J.A.S., H.P., A.S., F.C., A.R., A.J., B.M., F.A., C.J.B., T.L., and A.N. performed research; F.P., K.A.-W., D.M., and D.S. analyzed data; and F.P., K.A.-W., L.G.M.P., and A.N. wrote the paper.

The authors declare no conflict of interest.

This article is a PNAS Direct Submission.

This article contains supporting information online at www.pnas.org/lookup/suppl/doi:10.1073/pnas.1705303114/-/DCSupplemental.

Freely available online through the PNAS open access option.

FREE PDF GRATIS: PNAS

Cosmology is not science; it is philosophy of the origins of the universe

Wednesday, June 28, 2017


The Case Against Cosmology

M. J. Disney



Abstract

It is argued that some of the recent claims for cosmology are grossly overblown. Cosmology rests on a very small database: it suffers from many fundamental difficulties as a science (if it is a science at all) whilst observations of distant phenomena are difficult to make and harder to interpret. It is suggested that cosmological inferences should be tentatively made and sceptically received.

FREE PDF GRATIS: arXiv

The accidental universe: science's crisis of faith - mere chance, fortuitous necessity, or intelligent design?

Tuesday, June 27, 2017

The Accidental Universe

Science’s crisis of faith

The history of science can be viewed as the recasting of phenomena that were once thought to be accidents as phenomena that can be understood in terms of fundamental causes and principles. One can add to the list of the fully explained: the hue of the sky, the orbits of planets, the angle of the wake of a boat moving through a lake, the six-sided patterns of snowflakes, the weight of a flying bustard, the temperature of boiling water, the size of raindrops, the circular shape of the sun. All these phenomena and many more, once thought to have been fixed at the beginning of time or to be the result of random events thereafter, have been explained as necessary consequences of the fundamental laws of nature—laws discovered by human beings.

This long and appealing trend may be coming to an end. Dramatic developments in cosmological findings and thought have led some of the world’s premier physicists to propose that our universe is only one of an enormous number of universes with wildly varying properties, and that some of the most basic features of our particular universe are indeed mere accidents—a random throw of the cosmic dice. In which case, there is no hope of ever explaining our universe’s features in terms of fundamental causes and principles.

It is perhaps impossible to say how far apart the different universes may be, or whether they exist simultaneously in time. Some may have stars and galaxies like ours. Some may not. Some may be finite in size. Some may be infinite. Physicists call the totality of universes the “multiverse.” Alan Guth, a pioneer in cosmological thought, says that “the multiple-universe idea severely limits our hopes to understand the world from fundamental principles.” And the philosophical ethos of science is torn from its roots. As put to me recently by Nobel Prize–winning physicist Steven Weinberg, a man as careful in his words as in his mathematical calculations, “We now find ourselves at a historic fork in the road we travel to understand the laws of nature. If the multiverse idea is correct, the style of fundamental physics will be radically changed.”

The scientists most distressed by Weinberg’s “fork in the road” are theoretical physicists. Theoretical physics is the deepest and purest branch of science. It is the outpost of science closest to philosophy, and religion. Experimental scientists occupy themselves with observing and measuring the cosmos, finding out what stuff exists, no matter how strange that stuff may be. Theoretical physicists, on the other hand, are not satisfied with observing the universe. They want to know why. They want to explain all the properties of the universe in terms of a few fundamental principles and parameters. These fundamental principles, in turn, lead to the “laws of nature,” which govern the behavior of all matter and energy. An example of a fundamental principle in physics, first proposed by Galileo in 1632 and extended by Einstein in 1905, is the following: All observers traveling at constant velocity relative to one another should witness identical laws of nature. From this principle, Einstein derived his theory of special relativity. An example of a fundamental parameter is the mass of an electron, considered one of the two dozen or so “elementary” particles of nature. As far as physicists are concerned, the fewer the fundamental principles and parameters, the better. The underlying hope and belief of this enterprise has always been that these basic principles are so restrictive that only one, self-consistent universe is possible, like a crossword puzzle with only one solution. That one universe would be, of course, the universe we live in. Theoretical physicists are Platonists. Until the past few years, they agreed that the entire universe, the one universe, is generated from a few mathematical truths and principles of symmetry, perhaps throwing in a handful of parameters like the mass of the electron. It seemed that we were closing in on a vision of our universe in which everything could be calculated, predicted, and understood.

However, two theories in physics, eternal inflation and string theory, now suggest that the same fundamental principles from which the laws of nature derive may lead to many different self-consistent universes, with many different properties. It is as if you walked into a shoe store, had your feet measured, and found that a size 5 would fit you, a size 8 would also fit, and a size 12 would fit equally well. Such wishy-washy results make theoretical physicists extremely unhappy. Evidently, the fundamental laws of nature do not pin down a single and unique universe. According to the current thinking of many physicists, we are living in one of a vast number of universes. We are living in an accidental universe. We are living in a universe uncalculable by science.

...

FREE PDF GRATIS: HARPER'S MAGAZINE

Frequent sexual activity may boost brain power in older adults

Monday, June 26, 2017

Frequent Sexual Activity Predicts Specific Cognitive Abilities in Older Adults 

Hayley Wright, Rebecca A. Jenks, Nele Demeyere

J Gerontol B Psychol Sci Soc Sci gbx065. 


Published: 21 June 2017 Article history

Received: 27 September 2016



Abstract

Objectives:

This study replicates and extends the findings of previous research (Wright, H., & Jenks, R. A. (2016). Sex on the brain! Associations between sexual activity and cognitive function in older age. Age and Ageing, 45, 313–317. doi:10.1093/ageing/afv197) which found a significant association between sexual activity (SA) and cognitive function in older adults. Specifically, this study aimed to generalize these findings to a range of cognitive domains, and to assess whether increasing SA frequency is associated with increasing scores on a variety of cognitive tasks.

Methods:

Seventy-three participants aged 50–83 years took part in the study (38.4% male, 61.6% female). Participants completed the Addenbrooke’s Cognitive Examination-III (ACE-III) cognitive assessment and a questionnaire on SA frequency (never, monthly, or weekly), and general health and lifestyle.

Results:

Weekly SA was a significant predictor of total ACE-III, fluency, and visuospatial scores in regression models that also included age, gender, education, and cardiovascular health.
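A minimal sketch of the kind of regression reported above, using statsmodels on a simulated data frame; every column name and value here is invented for illustration and is not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 73  # same sample size as the study; the values below are simulated, not real data
df = pd.DataFrame({
    "ace_total": rng.normal(90, 5, n),           # ACE-III total score (0-100)
    "sa_weekly": rng.integers(0, 2, n),          # 1 = weekly sexual activity
    "age": rng.integers(50, 84, n),
    "gender": rng.integers(0, 2, n),
    "education_years": rng.integers(8, 20, n),
    "cardio_health": rng.integers(0, 2, n),
})

# Linear model: does weekly SA predict ACE-III scores once the covariates are included?
model = smf.ols(
    "ace_total ~ sa_weekly + age + gender + education_years + cardio_health",
    data=df,
).fit()
print(model.summary())
```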

Discussion:

Greater frequency of SA was associated with better overall ACE-III scores and scores on subtests of verbal fluency and visuospatial ability. Both of these tasks involve working memory and executive function, and links between sexual behavior, memory, and dopamine are discussed. The findings have implications for the maintenance of intimate relationships in later life.

Addenbrooke’s cognitive examination III, Cognition, Intimate relationships, Dopamine

Topics: aging; dopamine; cardiovascular system; life style; memory, short-term; mental processes; sex behavior; brain; memory; gender; executive functioning; visuospatial ability; elderly; speech fluency; cognitive ability; verbal fluency

Issue Section: Brief Report

The generalized theory of evolution: call for papers (deadline September 1, 2017)

The Generalized Theory of Evolution
January 31 – February 3, 2018
DCLPS, University of Duesseldorf, Germany


For some decades now, experts in several fields of the sciences of human nature, society, and culture have been using evolutionary models to explain their domain-specific phenomena. This has led to the prominent idea that the historical development of human culture, in all or many of its facets, is best described as a Darwinian process that is not based on genes but is still driven by the principles of variation, selection, and reproduction. At the beginning of the 21st century, a generalized theory of evolution appears to be emerging as an interdisciplinary theoretical structure, finding its place alongside other similarly interdisciplinary frameworks such as systems theory and action theory. Subdisciplines such as evolutionary psychology, evolutionary game theory, evolutionary epistemology, and the theory of cultural evolution in general seem to provide a set of models and explanatory tools that can ultimately be seen as varieties of one and the same basic theoretical structure: a generalized theory of evolution.
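As a toy illustration of the gene-free Darwinian process invoked above, the sketch below evolves a population of cultural variants under variation, selection, and reproduction; the variants, fitness values, and mutation rate are arbitrary choices made only for the example.

```python
import random

# Three hypothetical cultural variants with arbitrary "attractiveness" (fitness) scores
fitness = {"A": 1.0, "B": 1.2, "C": 0.8}
population = ["A"] * 40 + ["B"] * 30 + ["C"] * 30
MUTATION_RATE = 0.01

for generation in range(50):
    # Reproduction with selection: variants are copied in proportion to their fitness
    weights = [fitness[v] for v in population]
    population = random.choices(population, weights=weights, k=len(population))
    # Variation: occasional innovation replaces a variant with a random one
    population = [random.choice("ABC") if random.random() < MUTATION_RATE else v
                  for v in population]

counts = {v: population.count(v) for v in "ABC"}
print(counts)  # the higher-fitness variant B typically comes to dominate
```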

The generalization of the theory of evolution has had not only enthusiastic supporters but has also been exposed to severe criticism. In any case, various interesting questions can be raised within this framework. Is a Darwinian theory of cultural evolution a proper candidate to synthesize the social sciences? What is the added value of evolutionary explanations? More specifically, can language, meaning, and content be explained in terms of evolutionary signaling games of coordination? Which facets of biological evolutionary systems carry over to cultural evolutionary systems, and where do the two differ in relevant respects? For example, are there replicators in the cultural realm at all, and if so, what is their methodological and ontological status?

The conference aims to gather answers to some of these frequently raised questions and explores recent attempts to move beyond mere qualitative theorizing in the domain of generalized evolutionary systems. By bringing together researchers with a common interest but with different backgrounds and toolboxes, we hope to inspire interdisciplinary discussions and new collaborations.

Keynote Speakers:

Daniel Dennett (Tufts University)
Eva Jablonka (Tel Aviv University)
Alex Mesoudi (University of Exeter)
Thomas Reydon (University of Hannover)
Gerhard Schurz (University of Duesseldorf)
Brian Skyrms (University of California, Irvine)

Call for papers:

We invite contributions devoted to all fields of The Generalized Theory of Evolution. Abstracts should be suitable for a 20-minute presentation (plus 10 minutes of discussion) and contain no more than 500 words, including some references to important work that will be addressed. They have to be in English and prepared for blind review. The title of the paper as well as the name, affiliation, and e-mail address of the author must be included in a separate document. It should be clear from your abstract which authors your paper will address. Files have to be submitted via e-mail to the address listed below.
The submission deadline is September 1, 2017. Authors will be notified by September 30, 2017.

Organization:

DCLPS, University of Duesseldorf: Karim Baraghith, Christian J. Feldbacher-Escamilla, Corina Stroessner, Gerhard Schurz

Please do not hesitate to contact us for any further information.

Important dates and links:
Submission deadline: September 1, 2017
Notification deadline: September 30, 2017
E-Mail: christian.feldbacher-escamilla@hhu.de

Molecular structures guide the engineering of chromatin: mere chance, fortuitous necessity, or intelligent design?

Friday, June 23, 2017

Molecular structures guide the engineering of chromatin 

Stefan J. Tekel, Karmella A. Haynes

Nucleic Acids Res gkx531. DOI: https://doi.org/10.1093/nar/gkx531

Article History

Published: 13 June 2017 Received: 29 March 2017

Revision Received: 18 May 2017 Accepted: 07 June 2017



Abstract

Chromatin is a system of proteins, RNA, and DNA that interact with each other to organize and regulate genetic information within eukaryotic nuclei. Chromatin proteins carry out essential functions: packing DNA during cell division, partitioning DNA into sub-regions within the nucleus, and controlling levels of gene expression. There is a growing interest in manipulating chromatin dynamics for applications in medicine and agriculture. Progress in this area requires the identification of design rules for the chromatin system. Here, we focus on the relationship between the physical structure and function of chromatin proteins. We discuss key research that has elucidated the intrinsic properties of chromatin proteins and how this information informs design rules for synthetic systems. Recent work demonstrates that chromatin-derived peptide motifs are portable and in some cases can be customized to alter their function. Finally, we present a workflow for fusion protein design and discuss best practices for engineering chromatin to assist scientists in advancing the field of synthetic epigenetics.
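As a rough sketch of the "portable motif" idea the review describes, the snippet below assembles a hypothetical fusion construct from named parts; the domain names and amino-acid sequences are placeholders, not constructs from the paper.

```python
# Hypothetical building blocks for a synthetic chromatin effector
# (sequences are placeholders, not real protein sequences)
domains = {
    "reader_motif": "MKQTARKSTGG",   # e.g., a chromatin-binding peptide motif
    "linker": "GGSGGS",              # flexible linker
    "effector": "DALDDFDLDML",       # e.g., a transcription-regulating domain
}

def build_fusion(order):
    """Concatenate domain sequences in the requested order to form a fusion protein."""
    return "".join(domains[name] for name in order)

construct = build_fusion(["reader_motif", "linker", "effector"])
print(construct, len(construct))
```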

Topics: gene expression, fusion protein, chromatin, DNA, engineering, histones, peptides, epigenetics

FREE PDF GRATIS: Nucleic Acids Research

Moth eyes inspire new anti-reflective technology: but isn't design mere illusion in nature?

Thursday, June 22, 2017

Broadband antireflection film with moth-eye-like structure for flexible display applications

Guanjun Tan, Jiun-Haw Lee, Yi-Hsin Lan, Mao-Kuo Wei, Lung-Han Peng, I-Chun Cheng, and Shin-Tson Wu

Optica Vol. 4, Issue 7, pp. 678-683 (2017) 


Source/Fonte: Inside Science

Abstract

Sunlight readability is a critical requirement for display devices, especially for mobile displays. Anti-reflection (AR) films can greatly improve sunlight readability by reducing the surface reflection. In this work, we demonstrate a broadband moth-eye-like AR surface on a flexible substrate, intended for flexible display applications. The moth-eye-like nanostructure was fabricated by an imprinting process onto a flexible substrate with a thin hard-coating film. The proposed nanostructure exhibits excellent AR with luminous reflectance <0.23% and haze below 1% with indistinguishable image quality deterioration. A rigorous numerical model is developed to simulate and optimize the optical behaviors. Excellent agreement between the experiment and simulation is obtained. Meanwhile, the nanostructure shows robust mechanical characteristics (pencil hardness >3 H), which is favorable for touch panels. A small bending radius (8 mm) was also demonstrated, which makes the proposed nanostructure applicable for flexible displays. Additionally, a fluoroalkyl coating was applied onto the moth-eye-like surface to improve the hydrophobicity (with a water contact angle >100°). Such a self-cleaning feature helps protect touch panels from dust and fingerprints. The proposed moth-eye-like AR film is expected to find widespread applications for sunlight readable flexible and curved displays.
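The headline figure of merit above, luminous reflectance below 0.23%, is the surface reflectance spectrum weighted by the eye's photopic sensitivity V(λ). The sketch below computes that weighted average, assuming a crude Gaussian stand-in for V(λ) and a made-up flat reflectance spectrum; neither is taken from the paper.

```python
import numpy as np

wavelengths = np.arange(380, 781, 1)  # visible range in nm

# Crude Gaussian approximation of the CIE photopic sensitivity V(lambda), peaked at 555 nm
V = np.exp(-0.5 * ((wavelengths - 555.0) / 45.0) ** 2)

# Hypothetical measured reflectance spectrum of an AR-coated surface (fraction, not %)
R = np.full_like(wavelengths, 0.002, dtype=float)   # flat 0.2% for illustration

# Luminous reflectance = photopically weighted average of R(lambda)
R_lum = np.trapz(R * V, wavelengths) / np.trapz(V, wavelengths)
print(f"luminous reflectance = {R_lum * 100:.2f}%")
```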

© 2017 Optical Society of America

FREE PDF GRATIS: Optica

Homology-aware phylogenomics at gigabase scales

Homology-Aware Phylogenomics at Gigabase Scales

M. J. Sanderson, Marius Nicolae, M. M. McMahon

Syst Biol (2017) 66 (4): 590-603. DOI: https://doi.org/10.1093/sysbio/syw104

Published: 25 January 2017 Article history

Received: 19 May 2016 Revision Received: 31 October 2016

Accepted: 25 November 2016



Abstract

Obstacles to inferring species trees from whole genome data sets range from algorithmic and data management challenges to the wholesale discordance in evolutionary history found in different parts of a genome. Recent work that builds trees directly from genomes by parsing them into sets of small k-mer strings holds promise to streamline and simplify these efforts, but existing approaches do not account well for gene tree discordance. We describe a “seed and extend” protocol that finds nearly exact matching sets of orthologous k-mers and extends them to construct data sets that can properly account for genomic heterogeneity. Exploiting an efficient suffix array data structure, sets of whole genomes can be parsed and converted into phylogenetic data matrices rapidly, with contiguous blocks of k-mers from the same chromosome, gene, or scaffold concatenated as needed. Phylogenetic trees constructed from highly curated rice genome data and a diverse set of six other eukaryotic whole genome, transcriptome, and organellar genome data sets recovered trees nearly identical to published phylogenomic analyses, in a small fraction of the time, and requiring many fewer parameter choices. Our method’s ability to retain local homology information was demonstrated by using it to characterize gene tree discordance across the rice genome, and by its robustness to the high rate of interchromosomal gene transfer found in several rice species.
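The "seed" step amounts to finding k-mers shared exactly between genomes. The toy sketch below illustrates that idea with plain set intersection on made-up sequences; the real method relies on a suffix array and then extends each seed into longer homology-aware blocks, which this example does not attempt.

```python
def kmers(seq, k):
    """Return the set of all k-length substrings (k-mers) of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Two made-up genome fragments from different taxa
genome_a = "ACGTACGTTAGGCTAGCTAGGACGT"
genome_b = "TTACGTACGTTAGGCTTTCGGGACG"

k = 8
shared_seeds = kmers(genome_a, k) & kmers(genome_b, k)
print(sorted(shared_seeds))
# Each shared k-mer is a candidate orthologous "seed" that the real protocol would
# extend and concatenate into a phylogenetic data matrix.
```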

FREE PDF GRATIS: Syst Biol

Psst! The human genome was never completely sequenced!!!

Psst, the human genome was never completely sequenced. Some scientists say it should be

By SHARON BEGLEY @sxbegle JUNE 20, 2017


The feat made headlines around the world: “Scientists Say Human Genome is Complete,” the New York Times announced in 2003. “The Human Genome,” the journals Science and Nature said in identical ta-dah cover lines unveiling the historic achievement.

There was one little problem.

“As a matter of truth in advertising, the ‘finished’ sequence isn’t finished,” said Eric Lander, who led the lab at the Whitehead Institute that deciphered more of the genome for the government-funded Human Genome Project than any other. “I always say ‘finished’ is a term of art.”

“It’s very fair to say the human genome was never fully sequenced,” Craig Venter, another genomics luminary, told STAT.

“The human genome has not been completely sequenced and neither has any other mammalian genome as far as I’m aware,” said Harvard Medical School bioengineer George Church, who made key early advances in sequencing technology.

What insiders know, however, is not well-understood by the rest of us, who take for granted that each A, T, C, and G that makes up the DNA of all 23 pairs of human chromosomes has been completely worked out. When scientists finished the first draft of the human genome, in 2001, and again when they had the final version in 2003, no one lied, exactly. FAQs from the National Institutes of Health refer to the sequence’s “essential completion,” and to the question, “Is the human genome completely sequenced?” they answer, “Yes,” with the caveat that it’s “as complete as it can be” given available technology.

Perhaps nobody paid much attention because the missing sequences didn’t seem to matter. But now it appears they may play a role in conditions such as cancer and autism.

“A lot of people in the 1980s and 1990s [when the Human Genome Project was getting started] thought of these regions as nonfunctional,” said Karen Miga, a molecular biologist at the University of California, Santa Cruz. “But that’s no longer the case.” Some of them, called satellite regions, misbehave in some forms of cancer, she said, “so something is going on in these regions that’s important.”

Miga regards them as the explorer Livingstone did Africa, terra incognita whose inaccessibility seems like a personal affront. Sequencing the unsequenced, she said, “is the last frontier for human genetics and genomics.”

Church, too, has been making that point, mentioning it at both the May meeting of an effort to synthesize genomes, and at last weekend’s meeting of the International Society for Stem Cell Research. Most of the unsequenced regions, he said, “have some connection to aging and aneuploidy” (an abnormal number of chromosomes such as what occurs in Down syndrome). Church estimates 4 percent to 9 percent of the human genome hasn’t been sequenced. Miga thinks it’s 8 percent.

The reason for these gaps is that DNA sequencing machines don’t read genomes like humans read books, from the first word to the last. Instead, they first randomly chop up copies of the 23 pairs of chromosomes, which total some 3 billion “letters,” so the machines aren’t overwhelmed. The resulting chunks contain from 1,000 letters (during the Human Genome Project) to a few hundred (in today’s more advanced sequencing machines). The chunks overlap. Computers match up the overlaps, assembling the chunks into the correct sequence.
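Assembly is thus an overlap-matching problem: find reads whose ends share sequence and stitch them together. The toy greedy sketch below illustrates the principle on made-up reads; real assemblers are far more sophisticated, and highly repetitive satellite regions defeat this overlap logic, which is exactly why they remain unsequenced.

```python
def overlap(a, b, min_len=4):
    """Length of the longest suffix of a that matches a prefix of b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

# Made-up overlapping fragments ("reads") of a short genome
reads = ["GGCTAGCTT", "TAGCTTACGG", "TTACGGATCC"]

assembly = reads[0]
for read in reads[1:]:
    n = overlap(assembly, read)
    assembly += read[n:]          # append only the non-overlapping tail

print(assembly)  # GGCTAGCTTACGGATCC
```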
...
Read more here/Leia mais aqui: STAT

DNA replication filmed for the first time: it's not what was expected!

Wednesday, June 21, 2017

Independent and Stochastic Action of DNA Polymerases in the Replisome

James E. Graham3, Kenneth J. Marians, Stephen C. Kowalczykowski4

3Present address: Oxford Nanopore Technologies, Edmund Cartwright House, 4 Robert Robinson Avenue, Oxford Science Park, Oxford OX4 4GA, United Kingdom

4Lead Contact


Article Info

Publication History

Published: June 15, 2017 Accepted: May 26, 2017

Received in revised form: March 29, 2017 Received: November 9, 2016



Highlights

• Leading- and lagging-strand polymerases function autonomously within a replisome

• Replication is kinetically discontinuous and punctuated by pauses and rate-switches

• The helicase slows in a self-regulating fail-safe mechanism when synthesis pauses

• Priming is scaled to a 5-fold reduced processivity of the lagging-strand polymerase

Summary

It has been assumed that DNA synthesis by the leading- and lagging-strand polymerases in the replisome must be coordinated to avoid the formation of significant gaps in the nascent strands. Using real-time single-molecule analysis, we establish that leading- and lagging-strand DNA polymerases function independently within a single replisome. Although average rates of DNA synthesis on leading and lagging strands are similar, individual trajectories of both DNA polymerases display stochastically switchable rates of synthesis interspersed with distinct pauses. DNA unwinding by the replicative helicase may continue during such pauses, but a self-governing mechanism, where helicase speed is reduced by ∼80%, permits recoupling of polymerase to helicase. These features imply a more dynamic, kinetically discontinuous replication process, wherein contacts within the replisome are continually broken and reformed. We conclude that the stochastic behavior of replisome components ensures complete DNA duplication without requiring coordination of leading- and lagging-strand synthesis.
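A minimal toy simulation of the behavior described in the summary: one polymerase that stochastically switches between synthesis and pausing, with the helicase slowing while synthesis is paused. All probabilities and rates below are invented for illustration and are not the paper's measured values.

```python
import random

random.seed(1)

# Hypothetical per-second probabilities and rates (illustrative only, not measured values)
P_PAUSE, P_RESUME = 0.02, 0.10           # chance of entering / leaving a pause each second
POL_RATE = 500                            # nt/s synthesized while the polymerase is active
HELICASE_RATE, HELICASE_SLOW = 500, 100   # unwinding slows by ~80% while synthesis pauses

synthesized = unwound = 0
paused = False
for second in range(600):
    if paused:
        unwound += HELICASE_SLOW               # fail-safe: slow unwinding limits the ssDNA gap
        paused = random.random() > P_RESUME    # remain paused unless a resume event fires
    else:
        synthesized += POL_RATE
        unwound += HELICASE_RATE
        paused = random.random() < P_PAUSE

print(f"synthesized {synthesized} nt, unwound {unwound} nt, gap {unwound - synthesized} nt")
```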

FREE PDF GRATIS: Cell