Archive for Ciencia

The world's largest laser squeezes diamonds at planetary pressures

The giant laser at Lawrence Livermore National Laboratory (USA), which appeared in the film Star Trek Into Darkness as the core of the starship Enterprise, has compressed hard diamond to 5 terapascals, a …

Welcome to Thesisland

When I joined Quantum Diaries, I did so with trepidation: while it was an exciting opportunity, I was worried that all I could write about was the process of writing a thesis and looking for postdoc jobs. I ended up telling the site admin exactly that: I only had time to work on a thesis and job hunt. I thought I was turning down the offer. But the reply I got was along the lines of “It’s great to know what topics you’ll write about! When can we expect a post?”. So, despite the fact that this is a very different topic from any recent QD posts, I’m starting a series about the process of writing a physics PhD thesis. Welcome.

The main thesis editing desk: laptop, external monitor, keyboard, mouse; coffee, water; notes; and lots of encouragement.

There are as many approaches to writing a PhD thesis as there are PhDs, but they can be broadly described along a spectrum.

On one end is the “constant documentation” approach: spend some fixed fraction of your time on documenting every project you work on. In this approach, the writing phase is completely integrated with the research work, and it’s easy to remember the things you’re writing about. There is a big disadvantage: it’s really easy to write too much, to spend too much time writing and not enough doing, or otherwise unbalance your time. If you keep a constant fraction of your schedule dedicated to writing, and that fraction is (in retrospect) too big, you’ve lost a lot of time. But you have documented everything, which everyone who comes after will be grateful for. If they ever see your work.

The other end of the spectrum is the “write like hell” approach (that is, write as fast as you can), where all the research is completed and approved before writing starts. This has the advantage that if you (and your committee) decide you’ve written enough, you immediately get a PhD! The disadvantage is that if you have to write about old projects, you’ll probably have forgotten a lot. So this approach typically leads to shorter theses.

These two extremes were first described to me (see the effect of thesis writing? It’s making my blog voice go all weird and passive) by two professors who were in grad school together and still work together. Each took one approach, and they both did fine, but the “constant documentation” thesis was at least twice (or was it three times?) as long as the “write like hell” thesis.

Somewhere between those extremes is the funny phenomenon of the “staple thesis”: a thesis primarily composed of all the papers you wrote in grad school, stapled together. A few of my friends have done this, but it’s not common in my research group because our collaboration is so large. I’ll discuss that in more detail later.

I’m going for something in the middle: as soon as I saw a light at the end of the tunnel, I wanted to start writing, so I downloaded the UW LaTeX template for PhD theses and started filling it in. It’s been about 14 months since then, with huge variations in the writing/research balance. To help balance between the two approaches, I’ve found it helpful to keep at least some notes about all the physics I do, but nothing too polished: it’s always easier to start from some notes, however minimal, than to start from nothing.

When I started writing, there were lots of topics available that needed some discussion: history and theory, my detector, all the calibration work I did for my master’s project. I could have gone full-time writing at that point and had plenty to do. But my main research project wasn’t done yet. So for me, it’s not just a matter of balancing “doing” with “documenting”; it’s also a question of balancing old documentation with current documentation. I’ve almost, *almost* finished writing the parts that don’t depend on my work from the last year or so. In the meantime, I’m still finishing the last bits of analysis work.

It’s all a very long process. How many readers are looking towards writing a thesis later on? How many have gone through this and found a method that served them well? If it was fast and relatively low-stress, would you tell me about it?


Prototype CT scanner could improve targeting accuracy in proton therapy treatment

This article appeared in Fermilab Today on July 21, 2014.

Members of the prototype proton CT scanner collaboration move the detector into the CDH Proton Center in Warrenville. Photo: Reidar Hahn

A prototype proton CT scanner developed by Fermilab and Northern Illinois University could someday reduce the amount of radiation delivered to healthy tissue in a patient undergoing cancer treatment.

The proton CT scanner would better target radiation doses to the cancerous tumors during proton therapy treatment. Physicists recently started testing with beam at the CDH Proton Center in Warrenville.

To create a custom treatment plan for each proton therapy patient, radiation oncologists currently use X-ray CT scanners to develop 3-D images of patient anatomy, including the tumor, to determine the size, shape and density of all organs and tissues in the body. To make sure all the tumor cells are irradiated to the prescribed dose, doctors often set the targeting volume to include a minimal amount of healthy tissue just outside the tumor.

Collaborators believe that the prototype proton CT, which is essentially a particle detector, will provide a more precise 3-D map of the patient anatomy. This allows doctors to more precisely target beam delivery, reducing the amount of radiation to healthy tissue during the CT process and treatment.

“The dose to the patient with this method would be lower than using X-ray CTs while getting better precision on the imaging,” said Fermilab’s Peter Wilson, PPD associate head for engineering and support.

Fermilab became involved in the project in 2011 at the request of NIU’s high-energy physics team because of the laboratory’s detector building expertise.

The project’s goal was a tall order, Wilson explained. The group wanted to build a prototype device, imaging software and computing system that could collect data from 1 billion protons in less than 10 minutes and then produce a 3-D reconstructed image of a human head, also in less than 10 minutes. To do that, they needed to create a device that could read data very quickly, since every second data from 2 million protons would be sent from the device — which detects only one proton at a time — to a computer.

NIU physicist Victor Rykalin recommended building a scintillating fiber tracker detector with silicon photomultipliers. A similar detector was used in the DZero experiment.

“The new prototype CT is a good example of the technical expertise of our staff in detector technology. Their expertise goes back 35 to 45 years and is really what makes it possible for us to do this,” Wilson said.

In the prototype CT, protons pass through two tracking stations, which track the particles’ trajectories in three dimensions. (See figure.) The protons then pass through the patient and finally through two more tracking stations before stopping in the energy detector, which is used to calculate the total energy loss through the patient. Devices called silicon photomultipliers pick up signals from the light resulting from these interactions and subsequently transmit electronic signals to a data acquisition system.

In the prototype proton CT scanner, protons enter from the left, passing through planes of fibers and the patient’s head. Data from the protons’ trajectories, including the energy deposited in the patient, is collected in a data acquisition system (right), which is then used to map the patient’s tissue. Image courtesy of George Coutrakon, NIU

Scientists use specialized software and a high-performance computer at NIU to accurately map the proton stopping powers in each cubic millimeter of the patient. From this map, visually displayed as conventional CT slices, the physician can outline the margins, dimensions and location of the tumor.

Elements of the prototype were developed at both NIU and Fermilab and then put together at Fermilab. NIU developed the software and computing systems. The teams at Fermilab worked on the design and construction of the tracker and the electronics to read the tracker and energy measurement. The scintillator plates, fibers and trackers were also prepared at Fermilab. A group of about eight NIU students, led by NIU’s Vishnu Zutshi, helped build the detector at Fermilab.

“A project like this requires collaboration across multiple areas of expertise,” said George Coutrakon, medical physicist and co-investigator for the project at NIU. “We’ve built on others’ previous work, and in that sense, the collaboration extends beyond NIU and Fermilab.”

Rhianna Wisniewski

Accelerators Take Center Stage at Stanford-hosted Workshop

Physicists from SLAC's Advanced Accelerator Research department joined more than 250 of their colleagues from around the world to explore cutting-edge accelerator techniques for the next generation of accelerators – and beyond.


Scientists set aside rivalry to preserve knowledge

Scientists from two experiments have banded together to create a single comprehensive record of their work for scientific posterity.

Imagine Argentina and Germany, the 2014 World Cup finalists, meeting after the final match to write down all of their strategies, secrets and training techniques to give to the world of soccer.

This will never happen in the world of sports, but it just happened in the world of particle physics, where the goal of solving the puzzles of the universe belongs to all.

Two independent research teams from opposite sides of the Pacific Ocean that have been in friendly competition to discover why there is more matter than antimatter in the universe have just released a joint scientific memoir, The Physics of the B Factories.

The 900-page, three-inch-thick tome documents the experiments—BaBar, at the Department of Energy’s SLAC National Accelerator Laboratory in California, and Belle, at KEK in Tsukuba, Japan—as though they were the subject of a paper for a journal.

The effort took six years and involved thousands of scientists from all over the world.

“Producing something like this is a massive undertaking but brings a lot of value to the community,” says Tim Knight, a physicist at SLAC who was not involved in either experiment. “It’s a thorough summary of the B-factory projects, their history and their physics results. But more than that, it is an encyclopedia of elegant techniques in reconstruction and data analysis that are broadly applicable in high energy physics. It makes an excellent reference from which nearly any student can learn something valuable.”

BaBar and Belle were built to find the same thing: CP violation, a difference in the way matter and antimatter behave that contributes to the preponderance of matter in the universe. And they went about their task in essentially the same way: They collided electrons and their antimatter opposites, positrons, to create pairs of bottom and anti-bottom quarks. So many pairs, in fact, that the experiments became known as B factories—thus, the book title.

Both experiments were highly successful in their search, though what they found can’t account for the entire discrepancy. The experiments also discovered several new particles and studied rare decays.

In the process of finding CP violation they verified a theoretical model, called the CKM matrix, which describes certain types of particle decays. In 2008, Japanese theorists Makoto Kobayashi and Toshihide Maskawa—the “K” and the “M” of CKM—shared the Nobel Prize for their thus-verified model. The two physicists sent BaBar and Belle a thank-you note.

Meanwhile, Francois Le Diberder, the BaBar spokesperson at the time, had an idea.

“It’s Francois’ fault, really,” says Adrian Bevan, a physicist at Queen Mary University of London and long-time member of the BaBar collaboration. “In 2008 he said, ‘We should document the great work in the collaboration.’ The idea just resonated with a few of us. And then Francois said, ‘Let’s invite KEK, as it would be much better to document both experiments.’“

Bevan and a few like-minded BaBar members, such as Soeren Prell from Iowa State University, contacted their Belle counterparts and found them receptive to the idea. They recruited more than 170 physicists to help and spent six years planning, writing, editing and revising. Almost 2000 names appear in the list of contributors; five people, including Bevan, served as editors. Nobel laureates Kobayashi and Maskawa provided the foreword.

The book has many uses, according to Bevan: It’s a guide to analyzing Belle and BaBar data; a reference for other experiments; a teaching tool. Above all, it’s a way to keep the data relevant. Instead of becoming like obsolete alphabets for dead languages, as has happened with many old experiments, BaBar and Belle data can continue to be used for new discoveries. “This, along with long term data access projects, changes the game for archiving data,” Bevan says.

In what may or may not have been a coincidence, the completion of the manuscript coincided with the 50th anniversary of the discovery of CP violation. At a workshop organized to commemorate the anniversary, Bevan and his co-editors presented three specially bound copies of the book to three giants of the field: Nobel laureate James Cronin (pictured above, accepting his copy), one of the physicists who made that first discovery 50 years before, and old friends Kobayashi, who accepted in person, and Maskawa, who sent a representative.

Bevan jokes that Le Diberder cost them six years of hard labor, but the instigator of the project is unrepentant.

“Indeed, the idea is my fault,” Le Diberder, who is now at France’s Linear Accelerator Laboratory, says. “But the project itself got started thanks to Adrian and Soeren, who stepped forward to steward the ship. Once they gathered their impressive team they no longer needed my help except for behind-the-scenes tasks. They had the project well in hand.”

Bevan isn’t sure about the “well in hand” characterization. “It took a few years longer than we thought it would because we didn’t realize the scope of the thing,” Bevan says. “But the end result is spectacular.

“It’s War and Peace for physicists.”

 

Like what you see? Sign up for a free subscription to symmetry!

ABANDONAR EL PLANETA TIERRA (Leaving Planet Earth)

Documentary video; duration 01:30:01.

Science inspires at Sanford Lab’s Neutrino Day

Science was the star at an annual celebration in Lead, South Dakota.

At the Sanford Underground Research Facility’s seventh annual Neutrino Day last Saturday, more than 800 visitors of all ages and backgrounds got a glimpse of the high-energy physics experiments underway a mile below the streets of Lead, South Dakota.

After decades as a mining town, Lead has transformed in recent years into a science town. From within America’s largest and deepest underground mine, where hundreds of miners once pulled gold from the earth, more than a hundred scientists now glean insights into the mysteries of the universe.

“We don’t have a lot of vendors or food at Neutrino Day. It’s all science,” says Constance Walter, Sanford Lab’s communications director. “Our hope is that even people who didn’t before have a real interest in science will get excited. We want them to understand what we’re doing at Sanford Lab and the impact it can have on the region.”

This year, the festivities included tours of the above-ground facilities, live video chats with scientists and rescue personnel a mile underground (pictured above), a planetarium presentation, and hands-on science demos including the opportunity for kids to build battery-operated robots, use air pressure to change the size of marshmallows and learn about circuits using a conductive dough.

Science lectures also drew large crowds. Tennessee Technological University Professor Mary Kidd introduced attendees to the Majorana Demonstrator, which seeks to determine whether the neutrino is its own antiparticle and offer insight into the mass of neutrinos. Brookhaven National Laboratory physicist Milind Diwan wowed the crowd with his descriptions of the strange behavior of neutrinos and their many mysteries. And, in the keynote presentation, cosmologist Joel Primack and cultural philosopher Nancy Ellen Abrams discussed some of the most mindboggling unknowns in the universe—including the nature of dark matter and dark energy.

The highlight for 8th-grader Zoe Keehn was, without a doubt, a production put on by more than 30 local schoolchildren. Keehn played Hannah, the lead role in the NASA-sponsored Space School Musical. Hannah’s science project, a model of the solar system, is due tomorrow but it’s already past her bedtime. As she works to get it finished, Hannah’s friends—our solar system’s planets, moons, meteors, comets and asteroids—come out to help her with fun facts and information in the form of song.

“I’ve been in a lot of plays and musicals, and it was fun to be in a science one,” Keehn says. “I especially liked my S-P-A-C-E song. It goes, ‘The only place for me, a place I can be free, S-P-A-C-E, that’s where I’ve got to be.’”

Karen Everett, who as the executive director of the Lead-Deadwood Art Center came up with the idea of producing Space School Musical for Neutrino Day, says that the musical was a big hit. “People just loved it,” she says. “Through art, we can educate people about science.”

For a town that lost its main source of income when the mine shut down in 2003, the lab—and Neutrino Day—also offers a much-needed economic stimulus.

“With a little over 3000 people, Lead is a small town and one that’s been transitioning from its 125-year-old mining economy,” says Everett. “It was great to see so many people in town, enjoying the event, eating at local restaurants and generally just coming out. It was a great boost for us all.”

Walter sees it as a two-way street. “I love Neutrino Day because I see people—especially kids—who are excited to learn about what we do,” she says. “As the kids get excited, so do their parents and their teachers. And that’s so great to see. We need the support of the community for the laboratory to thrive and be successful.” 

The machine learning community takes on the Higgs

Detecting new physics isn’t quite like detecting cat videos—yet.

Scientists have created a contest that invites anyone to use machine learning—the kind of computing that allows Facebook to spot your friends in photos and Netflix to recommend your next film—to search for the Higgs boson.

More than 1000 individuals have already joined the race. They’re vying for prizes up to $7000, but according to contest organizers, the real winner might be the particle physics community, whose new connections with the world of data science could push them toward new methods of discovery.

The contest works like this: Participants receive data from 800,000 simulated particle collisions from the ATLAS experiment at the Large Hadron Collider. The collisions can be sorted into two groups: those with a Higgs boson and those without.

The data for each collision contains 30 details—including variables such as the energy and direction of the particles coming out of it. Contestants receive all of these details, but only 250,000 of the collisions are labeled “Higgs” or “non-Higgs.”

They must use this labeled fraction to train their algorithms to find patterns that point to the Higgs boson. When they’re ready, they unleash the algorithms on the unlabeled collision data and try to figure out which collisions contain a Higgs.
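Stripped of the physics, the workflow the contest asks for is ordinary supervised classification: fit a model on the labeled fraction, then predict labels for the unlabeled rest. Here is a minimal sketch in plain NumPy, using synthetic stand-in data and a simple nearest-centroid rule; none of the numbers, feature names or methods below come from the actual contest.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the contest data: each "collision" carries a few
# kinematic features, and "Higgs" events are drawn from a shifted distribution.
n_labeled, n_unlabeled, n_features = 1000, 400, 5
labels = rng.integers(0, 2, size=n_labeled)  # 1 = "Higgs", 0 = background
labeled = rng.normal(labels[:, None] * 0.8, 1.0, size=(n_labeled, n_features))
unlabeled = rng.normal(rng.integers(0, 2, size=(n_unlabeled, 1)) * 0.8,
                       1.0, size=(n_unlabeled, n_features))

# "Training": estimate one centroid per class from the labeled fraction.
centroids = np.stack([labeled[labels == c].mean(axis=0) for c in (0, 1)])

# "Prediction": assign each unlabeled collision to the nearer centroid.
dists = np.linalg.norm(unlabeled[:, None, :] - centroids[None, :, :], axis=2)
predictions = dists.argmin(axis=1)  # the 0/1 guesses a contestant would submit
```

Real entries replace the nearest-centroid rule with far more powerful models (boosted decision trees, neural networks), but the train-on-labeled, predict-on-unlabeled shape is the same.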

Contestants submit their answers online to Kaggle, a company that holds the answer key. When Kaggle receives a submission, it grades in real time just a portion of it—to prevent people from gaming the system—and then places the contestant on its public leaderboard.

At the end of the Higgs contest, Kaggle will reveal whose algorithm did the best job analyzing the full dataset. The top three teams will win $7000, $4000 and $3000. In addition, whoever has the most usable algorithm will be invited to CERN to see the ATLAS detector and discuss machine learning with LHC scientists.

The contest was conceived by a six-person group led by two senior researchers at France’s national scientific research center, CNRS: physicist David Rousseau, who served from 2010 to 2012 as software coordinator for the ATLAS experiment, and machine-learning expert Balázs Kégl, who since 2007 has been looking for ways to bring machine learning into particle physics.

The company running the contest, Kaggle, based in San Francisco, holds such challenges for research institutions and also businesses such as Liberty Mutual, Allstate, Merck, MasterCard and General Electric. They have asked data scientists to foresee the creditworthiness of loan applicants, to predict the toxicity of molecular compounds and to determine the sentiment of lines from movie reviews on the film-rating site Rotten Tomatoes.

Kaggle contests attract a mixed crowd of professional data scientists looking for fresh challenges, grad students and postdocs looking to test their skills, and newbies looking to get their feet wet, says Joyce Noah-Vanhoucke, Kaggle data scientist and head of competitions.

“We’re trying to be the home of data science on the internet,” she says.

Often contestants play for cash, but they have also competed for the chance to interview for data scientist positions at Facebook, Yelp and Walmart.

Kaggle is currently running about 20 contests on its site. Most of them will attract between 300 and 500 teams, Noah-Vanhoucke says. But the Higgs contest, which does not end until September, has already drawn almost 970. Names appear and drop off of the leaderboard every day.

“People love this type of problem,” Noah-Vanhoucke says. “It captures their imagination.”

A couple of the top contenders are physicists, but most come from outside the particle physics community. The team spent about 18 months working on organizing the contest in the hopes that it would create just this kind of crossover, Rousseau says.

“If due to this challenge physicists of the collaboration discover they have a friendly machine learning expert in the lab next door and they try to work together, that’s even better than just getting a new algorithm.”

Machine learning—known in physics circles as multivariate analysis—played a small role in the 2012 discovery of the Higgs. But physics is still about 15 years behind the cutting edge in this area, Kégl says. And it could be just what the science needs.

Artwork by: Sandbox Studio, Chicago

Until a couple of years ago, the Higgs was the last undiscovered particle of the Standard Model of particle physics.

“Physics is getting to a place where they’ve discovered everything they were looking for,” Kégl says.

Questions still remain, of course. What is dark matter? What is dark energy? Why is gravity so weak? Why is the Higgs so light?

“But the Higgs is a very specific, predicted thing,” Kégl says. “Physicists knew if it had this mass, it would decay in this way.

“Now they’re looking for stuff they don’t know. I’m really interested in methods that can find things that are not modeled yet.”

In 2012, the Google X laboratory programmed 1000 computers to look through 10 million randomly selected thumbnail images from YouTube videos. This was an example of unsupervised machine learning: The computers had no answer code; they weren’t given any goal other than to search for patterns.

And they found them. They grouped photos by categories such as human faces, human bodies—and cats. People teased that Google had created the world’s most complicated cat video detector. But joking aside, it was an impressive example of the ability of a machine to quickly organize massive amounts of data.
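The core mechanic of that unsupervised setup is easy to illustrate. Below is a toy version (a hand-rolled k-means on synthetic 2-D data, nothing like the scale of the Google experiment): the routine is told only how many groups to look for, never which point belongs where.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unlabeled points drawn from three hidden groups centered at 0, 2 and 4;
# the algorithm never sees which group produced which point.
data = np.concatenate([rng.normal(c, 0.3, size=(100, 2)) for c in (0.0, 2.0, 4.0)])

# Plain k-means: alternate between assigning each point to its nearest
# center and moving each center to the mean of its assigned points.
# (Seeded with one point from each region so this toy run stays stable;
# real initializations are random.)
centers = data[[0, 100, 200]].copy()
for _ in range(20):
    assign = np.linalg.norm(data[:, None] - centers[None], axis=2).argmin(axis=1)
    centers = np.stack([data[assign == k].mean(axis=0) for k in range(3)])
# centers recover the three hidden group locations near 0, 2 and 4
```

With no labels at all, the algorithm discovers the hidden structure, which is the same spirit as the YouTube experiment's face and cat categories.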

Physicists already research in a similar way, sorting through huge amounts of information in search of tiny signals. The clue to their next revolutionary discovery could lie in an almost unnoticeable deviation from the expected. Machine learning could be an important tool in finding it.

Physicists shouldn’t consider this a threat to job security, though. In the case of the Higgs contest, scientists needed to greatly simplify their data to make it possible for algorithms to handle it.

“A new algorithm would be a small piece of a full physics analysis,” Rousseau says. “In the future, physics will not be done by robots.”

He hopes they might help, though. The team is already planning the next competition.

 


Expediente X: El Manuscrito Voynich 2

Video; duration 23:03.

El códice Voynich: the world's most mysterious manuscript

Since its discovery, the Voynich manuscript has prompted all manner of speculation. This documentary attempts to decipher the keys to this more-than-500-year-old text by an unknown author, illustrated with unsettling drawings and written in an unidentified alphabet.

Duration 49:54.