|
Post by swamprat on Oct 7, 2019 16:38:21 GMT
Hmmm..... Back-engineered from UFO artifacts?
U.S. Air Force scientists developed liquid metal which autonomously changes structure
Published 08:19 (GMT+0000) October 5, 2019
Photo courtesy of Raytheon
As reported by the U.S. Air Force Research Laboratory, military scientists have developed a “Terminator-like” liquid metal that can autonomously change its structure, just like in a Hollywood movie.
The scientists developed liquid metal systems for stretchable electronics. Such systems – which can be bent, folded, crumpled and stretched – are a major research area for next-generation military devices.
Conductive materials change their properties as they are strained or stretched. Typically, electrical conductivity decreases and resistance increases with stretching.
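The typical behaviour described above follows from simple geometry. A back-of-the-envelope sketch (an illustration of ordinary conductors, not the AFRL material's mechanism): for an incompressible wire, stretching multiplies the length and shrinks the cross-section by the same factor, so resistance grows as the square of the stretch.

```python
def stretched_resistance(r0: float, strain: float) -> float:
    """Resistance of an ordinary incompressible wire under engineering strain.

    R = rho * L / A.  At constant volume, stretching multiplies L by
    (1 + strain) and divides A by the same factor, so R scales as
    (1 + strain) ** 2.
    """
    stretch = 1.0 + strain
    return r0 * stretch ** 2

# A conventional 1-ohm wire at the 700% strain quoted in the article
# (strain = 7.0) would see a 64-fold resistance increase:
print(stretched_resistance(1.0, 7.0))  # 64.0
```

This is the baseline against which the AFRL material's near-constant resistance is so surprising.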
The material recently developed by Air Force Research Laboratory (AFRL) scientists, called Polymerized Liquid Metal Networks, does just the opposite. These liquid metal networks can be strained up to 700%, autonomously respond to that strain to keep the resistance between those two states virtually the same, and still return to their original state. It is all due to the self-organized nanostructure within the material that performs these responses automatically.
“This response to stretching is the exact opposite of what you would expect,” said Dr. Christopher Tabor, AFRL lead research scientist on the project. “Typically a material will increase in resistance as it is stretched simply because the current has to pass through more material. Experimenting with these liquid metal systems and seeing the opposite response was completely unexpected and frankly unbelievable until we understood what was going on.”
Wires maintaining their properties under these different kinds of mechanical conditions have many applications, such as next-generation wearable electronics. For instance, the material could be integrated into a long-sleeve garment and used for transferring power through the shirt and across the body in a way that bending an elbow or rotating a shoulder won’t change the power transferred.
AFRL researchers also evaluated the material’s heating properties in a form factor resembling a heated glove. They measured thermal response with sustained finger movement and retained a nearly constant temperature with a constant applied voltage, unlike current state-of-the-art stretchable heaters that lose substantial thermal power generation when strained due to the resistance changes.
This project started within the last year and was developed in AFRL with fundamental research dollars from the Air Force Office of Scientific Research. It is currently being explored for further development in partnership with both private companies and universities. Working with companies on cooperative research is beneficial because they take early systems that function well in the lab and optimize them for potential scale up. In this case, they will enable integration of these materials into textiles that can serve to monitor and augment human performance.
The researchers start with individual particles of liquid metal enclosed in a shell, which resemble water balloons. Each particle is then chemically tethered to the next one through a polymerization process, akin to adding links into a chain; in that way all of the particles are connected to each other.
As the connected liquid metal particles are strained, the particles tear open and liquid metal spills out. Connections form to give the system both conductivity and inherent stretchability. During each stretching cycle after the first, the conductivity increases and then returns to normal. To top it off, there is no detectable fatigue after 10,000 cycles.
“The discovery of Polymerized Liquid Metal Networks is ideal for stretchable power delivery, sensing and circuitry,” said Capt. Carl Thrasher, research chemist within the Materials and Manufacturing Directorate at AFRL and lead author on the journal article. “Human interfacing systems will be able to operate continuously, weigh less, and deliver more power with this technology.”
“We think this is really exciting for a multitude of applications,” he added. “This is something that isn’t available on the market today so we are really excited to introduce this to the world and spread the word.”
defence-blog.com/news/u-s-air-force-scientists-developed-liquid-metal-which-autonomously-change-structure.html
|
|
|
Post by swamprat on Oct 12, 2019 15:25:06 GMT
FSU Physics Researchers Break New Ground, Explore Unknown Energy Regions by: Kathleen Haughney | Published: October 9, 2019
Florida State graduate student Jason Barlow works on a part of the GlueX detector at Jefferson National Laboratory. FSU scientists painted the part of the GlueX detector they built garnet and gold.
Florida State University physicists are using photon-proton collisions to capture particles in an unexplored energy region, yielding new insights into the matter that binds parts of the nucleus together.
“We want to understand not just the nucleus, but everything that makes up the nucleus,” said FSU Professor of Physics Paul Eugenio. “We’re working to understand the particles and forces that make up our world.”
FSU’s hadronic physics group is a leading member of the GlueX Collaboration at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility. The group ran highly sophisticated experiments around the clock for months at a time over several years starting in 2016. Their main goal is to ferret out new information about the material — called the gluonic field — that ties together quarks. Quarks are fundamental particles that create protons and neutrons.
In a new paper published in Physical Review Letters, the hadronic physics group at Florida State University and their collaborators laid out the first-ever measurements of a subatomic particle — called the J/psi particle — created out of the energy in the photon-proton collisions.
“It’s really cool to see,” said Assistant Professor of Physics Sean Dobbs. “This is opening up a new frontier of physics.”
When researchers run these experiments, they blast a photon beam into the GlueX spectrometer where it passes through a canister of liquid hydrogen and reacts with the protons in the nucleus of these hydrogen atoms. From there, the detectors measure the particles created in these collisions, which allows physicists to reconstruct the details of the collision and learn more about the created particles.
Dobbs compared it to a car wreck. You might not see the wreck happen, but you see the result and can work backward. In this case, researchers collected about one to two million gigabytes of data per year through this process to try to piece together the puzzle.
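To put "one to two million gigabytes of data per year" in perspective, a quick unit conversion (using only the article's figures) gives the sustained average data rate:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year in seconds

def sustained_rate_mb_per_s(gigabytes_per_year: float) -> float:
    """Average data rate in MB/s implied by a given annual volume in GB."""
    return gigabytes_per_year * 1000 / SECONDS_PER_YEAR

# 1-2 million GB per year (i.e. 1-2 petabytes) averages out to a
# continuous stream of roughly 32-63 MB every second:
print(round(sustained_rate_mb_per_s(1e6)))  # 32
print(round(sustained_rate_mb_per_s(2e6)))  # 63
```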
The J/psi particle is composed of a pair of quarks – a charm quark and an anti-charm quark. In measuring the J/psi particle in these collisions, scientists can also look for the production of other charm quark-containing subatomic particles.
The measurements were taken at an energy threshold below where previous studies looked at production levels, meaning it was more sensitive to the distribution of the gluons in the proton and their contributions to the proton mass.
Scientists found a much larger production of J/psi particles than expected, meaning this gluonic structure is a big contributor to the mass of the proton, and thus the nucleus as a whole. These initial measurements suggest that the gluons directly contribute more than 80 percent of the mass of the proton. Further measurements of these reactions, currently underway, will give more insight into how the gluons are distributed around the nucleon.
These measurements also brought into question observations from experiments on the Large Hadron Collider, a particle detector at CERN, the European Organization for Nuclear Research. Scientists there briefly glimpsed what they are calling pentaquarks – short lived particles made of five quarks.
FSU physicists did not specifically see pentaquarks in their data, which has ruled out several models which attempt to describe the structure of these pentaquarks. Further measurements underway are expected to give a more definitive answer on how the five quarks are arranged in these particles.
This work is funded by the Department of Energy and supported by the Thomas Jefferson National Accelerator Facility. The GlueX collaboration involves scientists from 29 institutions around the world.
FSU Professor Volker Credé, FSU Scientists Alexander Ostrovidov, postdoctoral fellow Daniel Lersch, and several FSU graduate students contributed to this work. The graduate students are Jason Barlow, Edmundo Barriga, Bradford Cannon, Ashley Ernst, Angelica Goncalves, and Lawrence Ng.
Thomas Jefferson National Accelerator Facility (TJNAF), commonly called Jefferson Lab or JLab, is a US National Laboratory located in Newport News, Virginia. Its stated mission is "to provide forefront scientific facilities, opportunities and leadership essential for discovering the fundamental structure of nuclear matter; to partner in industry to apply its advanced technology; and to serve the nation and its communities through education and public outreach."
Aerial view of Jefferson Lab
Schematic of the accelerator and the experimental halls after the 12 GeV energy upgrade.
Since June 1, 2006, it has been operated by Jefferson Science Associates, LLC, a limited liability company created by Southeastern Universities Research Association and PAE Applied Technologies. Until 1996 it was known as the Continuous Electron Beam Accelerator Facility (CEBAF); commonly, this name is still used for the main accelerator. Founded in 1984, Jefferson Lab employs more than 750 people, and more than 2,000 scientists from around the world have conducted research using the facility.
SOURCE: Wikipedia
news.fsu.edu/news/science-technology/2019/10/09/fsu-physics-researchers-break-new-ground-explore-unknown-energy-regions/?fbclid=IwAR2TF2TtMqLJB4y2cB2eT1PeXBrAiLGfSW2kNrJZNmcCPPkFEMucFgW4uc8
|
|
|
Post by swamprat on Oct 14, 2019 15:45:58 GMT
Refrigerator works by twisting and untwisting fibres 14 Oct 2019
Fridge-freezer: twistocaloric cooling could be coming to a kitchen near you. (Courtesy: iStock/Allevinatis)
A new refrigeration technology based on the twisting and untwisting of fibres has been demonstrated by a team led by Zunfeng Liu at Nankai University in China and Ray Baughman at the University of Texas at Dallas in the US. As the demand for refrigeration expands worldwide, their work could lead to the development of new cooling systems that do not employ gases that are harmful to the environment.
The cooling system relies on the fact that some materials undergo significant changes in entropy when deformed. As far back as 1805 – when the concepts of thermodynamics were first being developed – it was known that ordinary rubber heats up when stretched and cools down when relaxed. In principle, such mechanocaloric materials could be used in place of the gases that change entropy when compressed and expanded in commercial refrigeration systems. Replacing gas-based systems is an important environmental goal because gaseous refrigerants tend to degrade the ozone layer and are powerful greenhouse gases.
In their experiments, Liu and Baughman’s team studied the cooling effects of twist and stretch changes in twisted, coiled and supercoiled fibres of natural rubber, nickel-titanium and polyethylene fishing line. They observed surface cooling as high as 16.4 °C, 20.8 °C and 5.1 °C in these materials respectively, achieved through techniques including the simultaneous release of twisting and stretching, and the unravelling of bundles of multiple wires.
Supercoiled fibres
The team also made supercoiled fibres of natural rubber in which the twisting and coiling were done in opposite senses (clockwise and anticlockwise). Much to their surprise, they found that these structures cooled when stretched, rather than heated.
The team also looked at microscopic changes in the materials. An X-ray diffraction crystallography study of the polyethylene fishing line revealed changes in molecular structures associated with the transition from low to high entropy phases. The team identified this process as the cause of the effect, which they have dubbed “twistocaloric” cooling.
Liu, Baughman and their colleagues then built a simple device from a three-ply nickel-titanium wire cable, which cooled a stream of running water by as much as 7.7 °C as it unravelled. They propose that far higher levels of cooling could be reached through additional cycles of twisting and twist release within the cable – resulting in a highly efficient fridge.
The team faces many challenges in creating commercially viable twist fridges, including the need to find a material that is not degraded by being repeatedly twisted and untwisted. So far, they have only explored a few commercially available materials, but they now plan to expand their research to seek out materials with optimized mechanical and twistocaloric properties. If realized on commercial scales, twist fridge technologies could provide climate-friendly solutions to meeting our rapidly expanding demand for cooling.
The research is described in Science.
physicsworld.com/a/refrigerator-works-by-twisting-and-untwisting-fibres/
|
|
|
Post by swamprat on Oct 20, 2019 16:35:15 GMT
Gee Whiz! Space Companies Are Investing Big in 5G Technology
By Elizabeth Howell | 4 hours ago | Tech
Satellite internet is going to be a big thing.
A view of SpaceX's first 60 Starlink satellites in orbit, still in stacked configuration, with the Earth as a brilliant blue backdrop on May 23, 2019. (Image: © SpaceX)
Space companies worldwide want to bring more data to your devices, faster than ever before.
Entities ranging from SpaceX to Amazon are launching (or may launch soon) huge numbers of new satellites that can carry the extra bandwidth. And cellular network providers around the world are upgrading their equipment on the ground to meet the expected future demand.
This new technology is being built out for new 5G networks. It's touted as a big leap over current 4G technology, which allows you to do data-intensive things like stream Netflix.
5G will be even better, Will Townsend, a senior analyst for market research firm Moor Insights & Strategy, told Space.com. Users will experience less latency, he said. Latency refers to the time it takes to send a packet of data to a receiver (like a cellphone) on a network. 4G networks have about 50 milliseconds of latency, and 5G networks are expected to be 10 times better, with latencies of less than 5 milliseconds.
This will result in a "faster and more responsive" experience, Townsend said in an email. "For consumers, this will equate to faster downloads and a non-buffered video playback experience," he said. "Mobile gamers will appreciate fast responsiveness." Business applications will range from remote manufacturing to telesurgery, he added, and there will be a "richer retail experience bridging online capabilities." The growth of 5G will also help to address the rise of the internet of things, or the proliferation of network-connected, or "smart," devices. There are already smart fridges, stoves and security systems, for example, and consumers are also using wearable devices that share bandwidth on crowded mobile networks.
Meanwhile, businesses have embedded tracking devices in locations such as shipping containers, oil and gas lines, and power generators, with each device providing real-time information on the status of the thing being tracked. This information is meant to make it easier for companies to respond if something breaks and to keep better track of shipments crossing the globe with manufactured goods. Whole industries may change with the rise of connected devices, such as driving (with the use of autonomous vehicles) or factories (with production lines that may be able to monitor themselves).
When is 5G coming?
In the United States, the big four carriers — AT&T, Sprint, T-Mobile and Verizon — have already launched mobile 5G in a handful of metro areas. For example, as of July, Sprint had deployed mobile 5G in parts of Atlanta, Chicago, Dallas-Fort Worth, Houston, and Kansas City, Missouri, according to an article Townsend wrote for Forbes. And deployment will continue for all carriers through the rest of 2019 and into 2020, he said.
In many cases, however, you won't be able to access the network with your older device. Once the infrastructure equipment is upgraded, consumers will need to buy new cellphones. Check your preferred brand carefully. "Samsung and Android devices will lead Apple by 18 to 24 months in handsets," Townsend said. But there is big potential for carriers, who "are spending billions globally to upgrade the networks because they see the potential in monetizing new services," he added.
On the business side, one of the big arguments for moving to 5G is the ability to participate in "Industry 4.0," or the fourth industrial revolution. This term commonly refers to factories embedded with wireless connectivity in their machines and equipment. Using emerging artificial intelligence, the goal is for the factory to monitor its own production line and to make changes as needed for safety, efficiency or other needs. Some analysts worry that AI could replace jobs and make unemployment rise, while others are optimistic, saying new job opportunities will arise with the new technology.
Which space companies are working on 5G?
Many space entities are rushing to be trendsetters in 5G. For example, SpaceX has received approval to launch nearly 12,000 Starlink internet satellites (and recently applied to loft up to 30,000 more). In May, SpaceX launched its first 60 Starlink craft, which operate at a low-Earth-orbit altitude of about 342 miles (550 kilometers). (For comparison, the International Space Station orbits about 250 miles, or 400 km, above Earth.)
OneWeb has satellite-internet plans as well. The company plans to assemble a constellation of nearly 650 satellites to make web access easier around the world. OneWeb launched the first group of six satellites in February aboard a Soyuz rocket provided by European launch company Arianespace. These satellites circle Earth in near-polar orbits, at an altitude of roughly 750 miles (1,200 km). Amazon and Facebook are among the other companies planning 5G satellite networks.
What are the risks of 5G?
The proliferation of 5G satellites in orbit raises a number of questions from industry observers. A big one is the rising risk of collisions, which could, theoretically, spawn huge populations of orbital debris. The world got an inkling of this risk last month, when a European satellite made a precautionary maneuver to dodge a potential collision with one of the SpaceX Starlink satellites.
There also are worries about radio-frequency interference with all of these coming satellites. Operators of weather satellites, in particular, are concerned about some of the authorized 5G frequencies approaching the 23.8-gigahertz frequency commonly used in weather forecasts. At this bandwidth, "water vapor in the atmosphere gives off a feeble signal," and the satellites can examine humidity in the atmosphere, even if the region is cloudy, Popular Mechanics reported. That said, both NASA and the U.S. National Oceanic and Atmospheric Administration are negotiating with the Federal Communications Commission (which allocates spectrum frequencies to U.S. companies) to protect weather satellites, according to Popular Mechanics.
There's also concern that the abundance of satellites will interfere with sky observations. In June, the International Astronomical Union (IAU) expressed concern that thousands of satellites could interfere with the ability to examine dim and distant objects, not to mention the lives of nocturnal animals. "We do not yet understand the impact of thousands of these visible satellites scattered across the night sky, and despite their good intentions, these satellite constellations may threaten both," IAU officials said in a statement at the time.
As the 5G providers work out these kinks, there may be unpredictable effects of the new mobile technology, Townsend said. "Case in point: 4G LTE brought the capabilities required to make ride sharing a reality; no one really predicted that use case," he said. Townsend called this a positive development, as it "disrupted a multibillion [dollar] taxi cab industry [and] created new income opportunity" for individuals.
www.space.com/5g-in-space-internet-satellites.html
|
|
|
Post by swamprat on Oct 21, 2019 19:48:49 GMT
When 3D printing was developed and released, it didn't take long for it to become affordable and readily available. Guess what? Soon we'll be able to build our own robotic arms! New haptic arm places robotics within easy reach Date: October 20, 2019
Source: University of Bristol
Summary:
Imagine being able to build and use a robotic device without the need for expensive, specialist kit or skills. That is the vision that researchers from the University of Bristol have now turned into reality, creating a lightweight, affordable and simple solution for everyday users.
While multiple robotic arm devices already exist, most are heavy, expensive and outside the reach of individuals who lack the expertise to use them.
Mantis, designed by experts in human-computer interaction from Bristol's team of engineers, is the first system of its kind that enables light, affordable and accessible haptic force feedback.
Human beings have five senses, but electronic devices communicate with us using predominantly just two: sight and hearing. Haptic feedback (often shortened to just haptics) changes this by simulating the sense of touch. Not only can you touch a computer or other device, but the computer can touch you back. Force feedback is a particular kind of haptics in which the device applies physical forces back to the user.
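A standard way force feedback is rendered in software (a generic textbook sketch, not Mantis's actual implementation) is to model virtual surfaces as stiff springs: when the user's fingertip penetrates a virtual object, the device pushes back in proportion to the penetration depth. The stiffness value below is an arbitrary example.

```python
def wall_force(finger_pos: float, wall_pos: float = 0.0,
               stiffness: float = 500.0) -> float:
    """Spring-model force (N) for a virtual wall at `wall_pos` (metres).

    Inside the wall (finger_pos < wall_pos) the device pushes back with
    F = stiffness * penetration; in free space it applies no force.
    """
    penetration = wall_pos - finger_pos
    return stiffness * penetration if penetration > 0 else 0.0

# 4 mm into the wall -> 2 N pushing the finger back out; free space -> nothing.
print(wall_force(-0.004))  # 2.0
print(wall_force(0.010))   # 0.0
```

A real haptic loop runs this calculation hundreds to thousands of times per second so the wall feels solid rather than spongy.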
Theoretically, the Mantis could be built and used by anyone from secondary-school age upwards. Not only that, researchers say the Mantis can be built at a twentieth of the cost of the market equivalent because it uses components, including brushless motors, that cost significantly less than the high-fidelity equivalents often confined to research labs.
"Humans already have a great sense of touch. Mantis expands on this innate ability by enabling people to touch and feel 3D objects, adding more depth to the VR experience," says lead researcher Dr Anne Roudaut, from Bristol's Department of Computer Science.
"Imagine a user playing a game in Virtual Reality with Mantis attached to their fingers. They could then touch and feel virtual objects, thus immersing themselves both visually and physically in an alternative dimension."
Dr Roudaut and her PhD student Gareth Barnaby are in New Orleans (19-23 October) presenting the Mantis at the User Interface Software and Technology (UIST) conference, the premier forum for innovations in human-computer interfaces that brings together people from graphical and web user interfaces, tangible and ubiquitous computing, and virtual and augmented reality.
Project Mantis is also supported by a new spin-out venture, Senmag Robotics, which researchers hope will enable them to progress their design to market, starting with the production and testing of the first kits ready for release by the end of the year.
"We will be giving out the plans to allow anyone to build a Mantis," adds Gareth Barnaby. "Because we are keen to make force feedback devices more widespread and not confined to research labs, we are also looking to produce some easy to build kits as well as pre-built versions that we will make available on the website."
This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) and the Leverhulme Trust.
Story Source:
Materials provided by University of Bristol. Note: Content may be edited for style and length.
www.sciencedaily.com/releases/2019/10/191020084936.htm
|
|
|
Post by HAL on Oct 22, 2019 20:22:23 GMT
"Imagine a user playing a game in Virtual Reality with Mantis attached to their fingers. They could then touch and feel virtual objects, thus immersing themselves both visually and physically in an alternative dimension." The porn channels are going to love this. HAL
|
|
|
Post by swamprat on Nov 2, 2019 23:20:07 GMT
For The First Time Ever, Scientists Discover Fractal Patterns in a Quantum Material Mike McRae
18 Oct 2019
(Pete LinForth/Pixabay)
From tiny snowflakes to the jagged fork of a lightning bolt, it's not hard to find examples of fractals in the natural world. So it might come as a surprise that, until now, there have remained some places these endlessly repeating geometrical patterns have never been seen.
Physicists from MIT have now provided the first known example of a fractal arrangement in a quantum material.
The patterns were seen in an unexpected distribution of magnetic units called 'domains', which develop in a compound called neodymium nickel oxide - a rare earth nickelate with extraordinary properties.
Getting a better understanding of these domains and their patterns could potentially lead to new ways of storing and protecting digital information.
Naturally occurring fractal patterns in Romanesco broccoli (Brassica oleracea). (Photopips/iStock)
And that's pretty cool, because neodymium nickel oxide, or NdNiO3, is strange stuff.
Pull a piece out of your pocket and zap it with a current, and it'll conduct pretty easily. Drop it into liquid nitrogen so it falls below a critical temperature of around minus 123 degrees Celsius (minus 189 Fahrenheit), and it will shut up shop and become an insulator.
That's not the only thing that changes. As physicist Riccardo Comin explains, "The material is not magnetic at all temperatures."
Sure, even a common piece of magnetised iron will lose its talent for pointing north if you heat it enough, so this isn't all that strange. But neodymium nickel oxide doesn't play by the usual rules, so the precise way its electrons fall into magnetic arrangements has been a mystery.
What we do know is that, like most ferromagnetic materials, atoms in neodymium nickel oxide team up as tiny clumps of magnetically oriented particles called domains.
Domains come in a variety of sizes and arrangements, depending on quantum interactions between electrons and their atoms under certain conditions. But just how they emerge in neodymium nickel oxide, given its nature as a conductor moonlighting as an insulator, was the big question.
"We wanted to see how these domains pop up and grow once the magnetic phase is reached upon cooling down the material," says Comin.
Researchers have in the past scattered X-rays through the material to study its weird flip-flopping electromagnetic properties in the hopes of uncovering its electrical secrets.
While this showed how the material distributes its electrons at different temperatures, mapping the size and distribution of its domains under such conditions required a more focussed approach.
"So we adopted a special solution that allows squeezing this beam down to a very small footprint, so that we could map, point by point, the arrangement of magnetic domains in this material," says Comin.
That special solution was as old as it is novel – they used the same technology many old fashioned lighthouses employ to channel light into a tight beam.
Fresnel lenses are stacked layers of a transparent material with ridges that redirect electromagnetic radiation. While the lenses in lighthouses can be metres across, the ones Comin and his team developed were just 150 microns wide.
The end result was an X-ray beam small enough to detect the fine scale of magnetic domains across a thin film of lab-grown neodymium nickel oxide.
Most of those domains were tiny. Scattered among them were some bigger ones. But once the numbers were crunched and a map drawn, the distribution of bigger domains among a sea of tiny ones looked eerily similar no matter what scale you were using.
"The domain pattern was hard to decipher at first, but after analysing the statistics of domain distribution, we realised it had a fractal behaviour," says Comin.
"It was completely unexpected – it was serendipity."
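"Fractal behaviour" in the domain statistics means the size distribution is scale-free: it follows a power law, so zooming in or out leaves the statistics unchanged. A minimal sketch of that signature (an idealised illustration with an arbitrary exponent, not the MIT team's analysis):

```python
# For a power-law survival function P(size > s) = (s / s_min) ** (-alpha),
# the fraction of domains larger than s falls by the same constant factor
# every time s doubles, no matter what scale you look at.  That scale
# invariance is the hallmark of a fractal distribution.

def survival(s: float, s_min: float = 1.0, alpha: float = 1.5) -> float:
    """P(domain size > s) for an idealised power-law size distribution."""
    return (s / s_min) ** (-alpha)

# The drop per doubling is 2**-alpha regardless of where you measure it:
ratios = [survival(2 * s) / survival(s) for s in (1.0, 10.0, 100.0)]
print(ratios)  # three identical values, each 2**-1.5 ~ 0.354
```

A distribution with a characteristic size (say, a Gaussian) would fail this test: the ratio would change with s.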
Materials that can act both as a conductor and insulator already play a big role in the world of electronics. Transistors are based on this very principle.
But neodymium nickel oxide has another trick up its sleeve. The same fractal pattern of domains reappears when the temperature drops again, almost as if it has some kind of memory of where to redraw its borders.
"Similar to magnetic disks in spinning hard drives, one can envision storing bits of information in these magnetic domains," says Comin.
From resilient memory storage devices to artificial neurons, neodymium nickel oxide is sure to be part of the big picture of future electronics.
This research was published in Nature Communications.
www.sciencealert.com/for-the-first-time-scientists-have-discovered-fractal-patterns-in-a-quantum-material
|
|
|
Post by swamprat on Nov 25, 2019 15:47:05 GMT
CERN expected to announce one-year delay to Large Hadron Collider upgrade 25 Nov 2019 Michael Banks
Tunnel vision: Work began on the SwFr1.5bn (£1.1bn) High Luminosity Large Hadron Collider (HL-LHC) last year. (courtesy: CERN/ Julien Marius Ordan)
CERN is expected to announce a delay to a major upgrade of the lab’s Large Hadron Collider (LHC) at a meeting at CERN tomorrow. Work began on the SwFr1.5bn (£1.1bn) High Luminosity Large Hadron Collider (HL-LHC) last year, with the revamped machine originally set to switch on in 2026. Physics World understands that a one-year delay is expected to be agreed so that the lab can plug a gap of around £100m that was expected to be contributed to the HL-LHC by non-member countries. The upgraded facility may not now start until 2028.
The HL-LHC upgrade is designed to increase the collider’s luminosity by a factor of 10 over the original machine. This requires a significant modification to the beam line around the two largest LHC detectors – ATLAS and CMS. The work will involve upgrading about 1.2 km of the 27 km ring by installing 11-12 T superconducting magnets and superconducting “crab” cavities – which tilt the bunches so they collide more nearly head-on despite the crossing angle – to increase the number of collisions at the two detectors. The upgrade also involves modifications to the LHC’s detectors so that they can handle the increased luminosity.
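The luminosity penalty that crab cavities recover comes from the crossing angle between the two beams. A standard accelerator-physics sketch of the geometric reduction factor (the numerical inputs below are illustrative LHC-like values, not figures from the article):

```python
import math

def geometric_factor(theta_c: float, sigma_z: float, sigma_star: float) -> float:
    """Luminosity reduction factor from a beam crossing angle.

    F = 1 / sqrt(1 + (theta_c * sigma_z / (2 * sigma_star))**2), where
    theta_c is the full crossing angle (rad), sigma_z the bunch length (m)
    and sigma_star the transverse beam size at the collision point (m).
    Crab cavities tilt the bunches so they still overlap head-on,
    pushing F back towards 1.
    """
    piwinski = theta_c * sigma_z / (2 * sigma_star)
    return 1 / math.sqrt(1 + piwinski ** 2)

# Illustrative values: 285 urad crossing angle, 7.55 cm bunches, 16.7 um
# beams -> roughly 16% of the luminosity is lost without crab crossing:
print(round(geometric_factor(285e-6, 0.0755, 16.7e-6), 2))  # 0.84
```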
Down for longer
Work on the HL-LHC began in 2018 during “long shutdown 2”, which will last until 2021 and will see the completion of most of the civil construction for the new machine. The LHC will then run at 14 TeV for three years before being switched off again for the components of the HL-LHC to be installed during “long shutdown 3”. This was due to begin in 2024 and be complete in mid-2026 after which the HL-LHC would have a month of commissioning before physics begins at the end of that year.
However, Physics World understands that CERN will now have to contribute around £100m more towards the upgrade, which was expected to come from other non-member countries. This move could lead to a delay to the start of long shutdown 3, which is now expected to begin in 2025 and potentially last for three years – rather than 30 months as planned. In this case, long shutdown 3 would finish at the end of 2027 with physics on the HL-LHC not beginning until early 2028. This potential schedule change was also included in slides by the Columbia University particle physicist Gustaaf Brooijmans at a US high-energy-physics advisory panel meeting in Bethesda, Maryland, last week.
A decision to delay the HL-LHC is expected to be announced following a meeting at CERN tomorrow.
physicsworld.com/a/cern-expected-to-announce-one-year-delay-to-large-hadron-collider-upgrade/
|
|
|
Post by swamprat on Nov 29, 2019 17:37:42 GMT
US MILITARY WARNS OF “AUGMENTED HUMAN BEINGS” November 27th, 2019 | Dan Robitzski | Filed Under: Robots & Machines
Battlebots
The U.S. military has ambitious plans to turn its soldiers into high-tech cyborg warriors by making them stronger, enhancing their senses, and wiring their brains to computers.
Pentagon brass thinks these cyborgs will make their way to the battlefield by 2050, Army Times reports. The Department of Defense just declassified a report from October that details its plans for “human/machine fusion,” revealing its bizarre plan to bring to life military tech that’s always been safely quarantined within the realm of science fiction.
Bleeding Edge
The report’s executive summary identifies four key upgrades it hopes to develop over the next three decades. Two involve enhancing soldiers’ eyesight and hearing. The military also wants to make soldiers stronger by equipping them with new wearables.
According to the report, all three of these will “offer the potential to incrementally enhance performance beyond the normal human baseline.”
Top Priority
What has the military really excited, however, is the fourth category: “direct neural enhancement of the human brain for two-way data transfer.” In other words, connecting soldiers’ minds to computers, both so that military leaders could instantaneously transfer new information to troops and so that soldiers could control pilotless vehicles with their thoughts.
Troubling, however, is the report’s predicted aftermath: “introduction of augmented human beings into the general population, DOD active duty personnel, and near-peer competitors will accelerate in the years following 2050 and will lead to imbalances, inequalities, and inequities in established legal, security, and ethical frameworks.”
futurism.com/the-byte/us-military-augmented-human-beings?fbclid=IwAR3e0fY9Ifd9IMkfQbT88gZcC9s35lP-MiGkx6AeVjbSWUoLuSOM2QpKmxQ
|
|
|
Post by swamprat on Dec 10, 2019 16:24:07 GMT
Squeezing More from Gravitational-Wave Detectors December 5, 2019 | Physics
by Phillip Ball, M. Tse, et al
New hardware installed in current gravitational-wave detectors uses quantum effects to boost sensitivity and increase the event detection rate by as much as 50%.
Arms length. Gravitational-wave detectors work by splitting a beam from a main laser (bottom cylinder) into two perpendicular arms having mirrors at each end. The light from the two arms recombines at the detector (right side), producing an interference pattern that can reveal a passing gravitational wave. To improve the sensitivity, researchers have added so-called “squeezed” light to the main laser light (not shown). The resulting beam has less quantum noise. Source: T. Pyle/LIGO
Since 2015, gravitational-wave detections have become routine in the two US-based Advanced LIGO instruments and in the Virgo detector in Italy, opening a new window in astronomy. The LIGO and Virgo collaborations have now demonstrated—in separate papers—a modification to their detectors that uses quantum physics to suppress random noise in the signal. The scheme improves the sensitivity of both instruments, which will boost the expected rate of detections by 20 to 50%.
Advanced LIGO and Virgo use interference of laser light bouncing back and forth along two perpendicular arms, 3–4 km long, to detect the spacetime ripples from a passing gravitational wave. The detector sensitivity—which corresponds to space distortions of close to 10⁻²⁰ m—is limited by the effects of so-called quantum noise in the photons. Each photon in the light beam experiences quantum fluctuations, which affect its time of arrival after a round trip along the arms. “The photons arrive ‘on time’ at the detector on average, but some are very early and some are very late, forming a wide bell curve,” says Maggie Tse, a graduate student at the Massachusetts Institute of Technology (MIT), Cambridge, and a LIGO team member. The detectors are only sensitive to a gravitational wave if it changes the travel time in one arm by more than the width of this bell curve.
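The bell-curve picture can be illustrated with a toy simulation: averaging over many photons narrows the uncertainty on the mean arrival time as 1/√N, so both photon statistics and the width of the per-photon distribution set the sensitivity. A sketch with illustrative numbers only (not the real interferometer analysis):

```python
import random
import statistics

random.seed(0)

def mean_arrival_spread(n_photons, sigma_t, n_trials=2000):
    """Standard deviation of the *average* arrival time over n_photons,
    each with Gaussian timing jitter sigma_t: shrinks as sigma_t/sqrt(N)."""
    means = [
        statistics.fmean(random.gauss(0.0, sigma_t) for _ in range(n_photons))
        for _ in range(n_trials)
    ]
    return statistics.pstdev(means)

s1 = mean_arrival_spread(1, 1.0)      # single photon: full jitter, near 1.0
s100 = mean_arrival_spread(100, 1.0)  # 100 photons: roughly 10x narrower
print(s1, s100)
```

Squeezing attacks the same problem from the other side, narrowing the per-photon distribution itself.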
The LIGO and Virgo teams have reduced this noise using quantum squeezing—an idea first suggested nearly 40 years ago by quantum physicist Carlton Caves. Quantum squeezing makes the arrival-time bell curve narrower, so that the photon fluctuations mask fewer of the gravitational-wave signals. A few prototype demonstrations have previously shown that squeezing can reduce noise in gravitational-wave detection, and it has been used for several years at the GEO600 detector operated by the Albert Einstein Institute (AEI) in Germany.
Squeezing in. Researchers install the quantum squeezing optical circuitry in one of the LIGO detectors. Source: LIGO
To realize squeezing for the Advanced LIGO and Virgo projects, the teams incorporated several lessons from those earlier experiments. In both detectors, the heart of the squeezer is an optical parametric oscillator. This device produces pairs of photons that are entangled so that when one is “early” by a certain amount, the other is “late” by almost the same amount. These correlated photons are injected into the path of the main laser beam, after which the merged light is split in two, with each of the resulting beams being sent along one arm of the device. Once prepared in this way, the beams in the two arms are correlated, and their noise is reduced. “The underlying physical process used to generate the squeezed states is the same for Virgo and LIGO,” says Henning Vahlbruch, a member of the AEI team that worked with the Virgo Collaboration.
One challenge is that squeezing is no free lunch. Because of Heisenberg’s uncertainty principle, “when we squeeze the distribution of photon arrival times, something else must be becoming more uncertain,” says Tse. In this case, the number of photons striking the mirror at a given time becomes more random, leading to an increase in so-called quantum radiation pressure noise. The tradeoff is worth it for now, says Tse, because the radiation pressure effect is only noticeable at low frequencies. Nevertheless, this other source of noise can also limit the instrument sensitivity, which is why the LIGO researchers are now working on building a low-frequency filter, which they expect to be installed at the LIGO sites in the next few years. A similar modification is planned for Virgo.
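The tradeoff Tse describes can be written down directly for an ideal squeezed vacuum state: shrinking one quadrature’s variance by e^(−2r) inflates the conjugate quadrature by e^(+2r), keeping the uncertainty product pinned at the Heisenberg minimum. A minimal sketch (ħ = 1 convention; a simplification of the full interferometer physics):

```python
import math

def squeezed_variances(r, vacuum_var=0.5):
    """Quadrature variances of an ideal squeezed vacuum state with
    squeeze parameter r (vacuum variance 1/2 in each quadrature).
    Squeezing one quadrature anti-squeezes the other; the product
    stays at the Heisenberg minimum."""
    var_timing = vacuum_var * math.exp(-2 * r)     # squeezed side
    var_amplitude = vacuum_var * math.exp(+2 * r)  # anti-squeezed side
    return var_timing, var_amplitude

vt, va = squeezed_variances(0.35)  # r = 0.35 is roughly 3 dB of squeezing
print(vt < 0.5, va > 0.5, math.isclose(vt * va, 0.25))
```

The anti-squeezed quadrature is what feeds the extra radiation-pressure noise at low frequencies.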
For now, squeezing enhanced the sensitivity of the instruments significantly during the third observing run, which started in April of this year. “We can now detect sources that are about 15% further away for the average binary neutron star system,” says LIGO team member Lisa Barsotti from MIT. This means that “the number of sources we expect to detect has increased by about 50%.” The Advanced Virgo detector, meanwhile, increased its range by a little less: about 5–8% for binary neutron stars, giving a detection-rate boost of 16–26%.
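The quoted numbers are consistent with the detection rate scaling as the surveyed volume, i.e. as the cube of the range:

```python
# Detection rate scales with the surveyed volume, i.e. as range cubed.
def rate_boost(range_gain):
    """Fractional increase in event rate for a fractional range increase."""
    return (1 + range_gain) ** 3 - 1

print(round(rate_boost(0.15) * 100))  # 15% more range -> ~52% more events
print(round(rate_boost(0.05) * 100),
      round(rate_boost(0.08) * 100))  # 5-8% more range -> ~16-26% more events
```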
Barsotti adds that, thanks to such improvements, the next major LIGO upgrade should increase the detection rate by more than a factor of 5. Moreover, squeezing specifically boosts the sensitivity to high-frequency gravitational waves, which should help in pinpointing the location of wave sources in the sky. This directional information allows astronomers to do follow-up observations in which they look for electromagnetic signals from events that produce gravitational waves.
Gravitational astrophysicist Kirk McKenzie of the Australian National University in Canberra says the results represent “a new era for gravitational-wave detectors.” Such quantum engineering, he says, has now “crossed from being a theoretical improvement to a tool to detect black-hole mergers further out in the universe than ever before.”
Astrophysicist Katerina Chatziioannou of the Flatiron Institute in New York says the improvements made by the two groups might enable us to see gravitational-wave signals from entirely new types of sources, such as supernovae or remnant stars created in binary neutron star collisions.
This research is published in Physical Review Letters.
physics.aps.org/articles/v12/139?utm_campaign=weekly&utm_medium=email&utm_source=emailalert
|
|
|
Post by swamprat on Dec 11, 2019 15:37:47 GMT
Lighting up cardiovascular problems using nanoparticles
New nanotechnology will allow detection of blocked arteries more effectively than ever before
Date: December 9, 2019
Source: University of Southern California
Summary:
A new nanoparticle innovation that detects unstable calcifications that can trigger heart attacks and strokes may allow doctors to pinpoint when plaque on the walls of blood vessels becomes dangerous.
Heart disease and stroke are the world's two most deadly diseases, causing over 15 million deaths in 2016 according to the World Health Organization. A key underlying factor in both of these global health crises is the common condition, atherosclerosis, or the build-up of fatty deposits, inflammation and plaque on the walls of blood vessels. By the age of 40, around half of us will have this condition, many without symptoms.
A new nanoparticle innovation from researchers in USC Viterbi's Department of Biomedical Engineering may allow doctors to pinpoint when plaque becomes dangerous by detecting unstable calcifications that can trigger heart attacks and strokes.
The research -- from Ph.D. student Deborah Chin under the supervision of Eun Ji Chung, the Dr. Karl Jacob Jr. and Karl Jacob III Early-Career Chair, in collaboration with Gregory Magee, assistant professor of clinical surgery from Keck School of Medicine of USC -- was published in the Royal Society of Chemistry's Journal of Materials Chemistry B.
When atherosclerosis occurs in coronary arteries, blockages due to plaque or calcification-induced ruptures can lead to a clot, cutting blood flow to the heart, which is the cause of most heart attacks. When the condition occurs in the vessels leading to the brain, it can cause a stroke.
"An artery doesn't need to be 80 percent blocked to be dangerous. An artery with 45% blockage by plaques could be more rupture-prone," Chung said. "Just because it's a big plaque doesn't necessarily mean it's an unstable plaque."
Chung said that when small calcium deposits, called microcalcifications, form within arterial plaques, the plaque can become rupture prone.
However, identifying whether blood vessel calcification is unstable and likely to rupture is particularly difficult using traditional CT and MRI scanning methods, or angiography, which has other risks.
"Angiography requires the use of catheters that are invasive and have inherent risks of tissue damage," said Chin, the lead author. "CT scans on the other hand, involve ionizing radiation which can cause other detrimental effects to tissue."
Chung said that the resolution limitations of traditional imaging offer doctors a "bird's eye view" of larger-sized calcification, which may not necessarily be dangerous. "If the calcification is on the micro scale, it can be harder to pick out," she said.
The research team developed a nanoparticle, known as a micelle, which attaches itself and lights up calcification to make it easier for smaller blockages that are prone to rupture to be seen during imaging.
Chin said the micelles are able to specifically target hydroxyapatite, a unique form of calcium present in arteries and atherosclerotic plaques.
"Our micelle nanoparticles demonstrate minimal toxicity to cells and tissue and are highly specific to hydroxyapatite calcifications," Chin said. "Thus, this minimizes the uncertainty in identifying harmful vascular calcifications."
The team has tested its nanoparticle on calcified cells in a dish, in a mouse model of atherosclerosis, and in patient-derived artery samples provided by vascular surgeon Magee – showing its applicability not only in small animals but also in human tissue.
"In our case, we demonstrated that our nanoparticle binds to calcification in the most commonly used mouse model for atherosclerosis and also works in calcified vascular tissue derived from patients," Chin said.
Chung said that the next step for the team was to harness the micelle particles for use in targeted drug therapy to treat calcification in arteries, rather than just as a means of detecting the potential blockages.
"The idea behind nanoparticles and nanomedicine is that it can be a carrier like the Amazon carrier system, shuttling drugs right to a specific address or location in the body, and not to places that you don't want it to go to," Chung said.
"Hopefully that can allow for lower dosages, but high efficacy at the disease site without hurting normal cells and organ processes," she said.
Story Source:
Materials provided by University of Southern California. Original written by Greta Harrison. Note: Content may be edited for style and length.
Journal Reference:
Deborah D. Chin, Jonathan Wang, Margot Mel de Fontenay, Anastasia Plotkin, Gregory A. Magee, Eun Ji Chung. Hydroxyapatite-binding micelles for the detection of vascular calcification in atherosclerosis. Journal of Materials Chemistry B, 2019; 7 (41): 6449 DOI: 10.1039/C9TB01918A
|
|
|
Post by swamprat on Dec 12, 2019 19:34:55 GMT
...and so it continues.....These 10 countries top the ranks in chemistry research Where the best chemistry takes place.
12 December 2019
Gemma Conroy
Aurelien Bancaud, an award-winning chemist from the French National Centre for Scientific Research (CNRS). CNRS is one of the most prolific institutions in chemistry in the Nature Index. ERIC CABANIS/AFP via Getty Images
For the first time, China has taken the Nature Index crown as the biggest producer of high-quality research in chemistry, knocking the United States down to second place.
China’s chemistry output has grown by 17.9% since 2017, to achieve an impressive Share of 6,183.75 in 2018. Its output is almost double the collective Share of its Asian neighbours in the top 10: Japan, South Korea, and India.
Share, formerly referred to in the Nature Index as Fractional Count (FC), is a measure of an institution’s contribution to articles in the 82 journals tracked by the index.
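As a sketch of how Share works: each article counts as 1, split equally among its authors, and an institution's (or country's) Share is the sum of its authors' fractions. The helper below illustrates the idea with made-up affiliations:

```python
def share(author_affiliations):
    """Nature Index Share (fractional count) from one article: each
    author gets an equal fraction of 1, and an institution's Share is
    the sum of its authors' fractions. (Sketch of the published
    definition; real articles may need per-author multi-affiliation
    handling.)"""
    n = len(author_affiliations)
    totals = {}
    for inst in author_affiliations:
        totals[inst] = totals.get(inst, 0.0) + 1.0 / n
    return totals

# Four authors; two from "MIT", one each from "CNRS" and "USC"
# (hypothetical affiliations for illustration).
print(share(["MIT", "MIT", "CNRS", "USC"]))
# -> {'MIT': 0.5, 'CNRS': 0.25, 'USC': 0.25}
```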
After taking the top spot in chemistry for three years in a row, the US fell behind China in 2018 with a Share of 5,371.32, representing a 6.2% drop from the previous year.
While the eight other nations in the top 10 have maintained their places since 2017, all except Spain have seen declines in their chemistry output.
Japan is the fourth most prolific country in high-quality chemistry publishing, with a Share of 1,388.14. But it had the largest decrease in output among the top 10 countries between 2017 and 2018, dropping by 12.6%.
The United Kingdom took the fifth spot, with a Share of 1,023.58 – a 10.8% decrease since 2017.
Holding its own in tenth place, Spain showed signs of growth in chemistry research publishing, with its Share rising by 1.3% between 2017 and 2018.
Below are the top 10 countries in chemistry in the Nature Index.
www.natureindex.com/news-blog/these-ten-countries-top-the-ranks-in-chemistry-research
|
|
|
Post by HAL on Dec 12, 2019 22:13:22 GMT
Surprised not to see Israel on the list.
|
|
|
Post by swamprat on Dec 16, 2019 16:40:41 GMT
Physics drives ongoing developments in proton therapy 16 Dec 2019 | Tami Freeman
The ProBE linac will accelerate protons from a medical cyclotron to the higher energies required for proton imaging. (Courtesy: Joel Sauza Bedolla)
The recent meeting, Physics-based Contributions to New Medical Techniques, examined how physics technologies are employed to help develop a diverse range of medical applications. One area in particular in which physics has played a vital role is the evolution of particle therapy systems and techniques.
Hywel Owen from the University of Manchester gave meeting attendees an introduction to the UK’s two NHS-funded proton therapy centres, at The Christie in Manchester and UCLH in London. He noted that a number of university academics around the UK are collaborating with these centres to improve the science and technology of radiotherapy. The Christie, which began proton treatments at the end of last year, has also constructed a research beamline at its proton centre, at which researchers from Christie Hospital and the University of Manchester will conduct novel research.
Although the UK centres offer state-of-the-art treatments, Owen explained that there are a number of opportunities to further improve the quality of treatments. Areas in which UK researchers are working include reducing treatment times, improving the imaging and accuracy of treatment, and developing use of alternative particles such as carbon ions and electrons.
Owen described his group’s work to develop the world’s first superconducting cyclotron that operates at 70 MeV. This system aims to provide a route to higher dose rate delivery for shallow treatments such as ocular therapy and will potentially give a better dose distribution than current technologies. The Cockcroft Institute has collaborated with Antaya – one of the world’s leading cyclotron developers – to produce a prototype; the larger magnetic fields obtained with superconducting magnets allow cyclotrons to be made much smaller and cheaper.
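The size argument follows from the bending relation p = qBr: at a fixed proton momentum, raising the magnetic field shrinks the bending radius proportionally. A back-of-envelope sketch with illustrative field values (not the actual machine parameters):

```python
import math

M_P = 938.272  # proton rest mass-energy, MeV

def extraction_radius(kinetic_mev, b_tesla):
    """Bending radius r = p/(qB) of a proton at the given kinetic
    energy, using p[MeV/c] = 299.79 * B[T] * r[m]. A smaller radius at
    higher field is why superconducting magnets shrink the cyclotron."""
    total = kinetic_mev + M_P
    p = math.sqrt(total**2 - M_P**2)  # relativistic momentum, MeV/c
    return p / (299.79 * b_tesla)

# Illustrative fields only: conventional-magnet vs superconducting scale.
print(round(extraction_radius(70, 1.5), 2))  # ~0.82 m
print(round(extraction_radius(70, 4.5), 2))  # ~0.27 m, three times smaller
```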
Another novel research development is the ProBE (proton boosting extension for imaging and therapy) linac – a joint project between the Cockcroft Institute, The Christie and CERN. ProBE is designed to accelerate protons from a medical cyclotron to the higher energies required for proton imaging. Owen explained that a prototype cavity has been manufactured and is predicted to achieve a gradient of about 54 MV/m. By adding it to a proton therapy centre’s beam transport system, whole-body proton imaging of adults becomes possible.
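As a rough feasibility check, with assumed (not quoted) beam energies: a therapy cyclotron delivers around 250 MeV, whole-body proton imaging needs roughly 330 MeV, so at the quoted 54 MV/m only about a metre and a half of active accelerating structure would be required:

```python
def active_length_m(e_in_mev, e_out_mev, gradient_mv_per_m):
    """Back-of-envelope accelerating length: energy gain / gradient,
    ignoring transit-time and packing factors."""
    return (e_out_mev - e_in_mev) / gradient_mv_per_m

# Assumed energies for illustration: ~250 MeV cyclotron output boosted
# to ~330 MeV for whole-body proton imaging, at 54 MV/m.
print(round(active_length_m(250, 330, 54), 1))  # ~1.5 m of active structure
```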
Rapid QA frees up treatment time
The big advantage of proton therapy arises from the fact that protons deposit most of their energy at a specific depth – the Bragg peak – and then stop, sparing surrounding normal tissue. But as Simon Jolly from University College London (UCL) explained, this highly localized dose deposition is also a disadvantage, as any range uncertainties necessitate the use of margins around the target volume. Effective quality assurance (QA) of proton therapy set-ups is thus essential to exploit the full dosimetric benefit of proton therapy. Unfortunately, such procedures can be time consuming.
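The sensitivity of range to energy is what makes those margins necessary. A common parameterization is the Bragg-Kleeman rule, R = αE^p, for protons in water; the sketch below (standard water-fit constants, not values from the talk) shows how a 1% energy uncertainty translates into a millimetre-scale range shift:

```python
def proton_range_cm(e_mev, alpha=0.0022, p=1.77):
    """Bragg-Kleeman rule for proton range in water, R = alpha * E^p.
    alpha and p are commonly quoted water-fit values."""
    return alpha * e_mev ** p

r = proton_range_cm(150)           # ~15-16 cm for a 150 MeV beam
dr = proton_range_cm(151.5) - r    # effect of a 1% energy uncertainty
print(round(r, 1), round(dr * 10, 1))  # range in cm, shift in mm
```

Since dR/R ≈ p·dE/E, even a 1% energy error shifts the Bragg peak by close to 3 mm at therapeutic energies, well above the millimetre-level clinical tolerance.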
Jolly described a prototype range measurement device under development at UCL that should enable faster and more accurate proton range measurements, thereby speeding up the daily QA process. “We are transferring technology from pure high-energy physics research to proton therapy,” he told the audience.
The prototype proton range measurement system. (Courtesy: Simon Jolly)
Multilayer ionization chambers (MLICs) can perform beam range measurements in just a few minutes, but can be bulky and expensive. The UCL device is similar to an MLIC design, but replaces the stack of ionization chambers with individual sheets of plastic scintillator, of the type used in the SuperNEMO double-β decay experiment. Jolly notes that this lightweight plastic is near-water-equivalent, provides high light output and has excellent energy resolution. Unlike MLICs, it is also capable of making measurements at FLASH dose rates.
To measure range, the proton beam is fired horizontally into the end of the stack of scintillator sheets. The device reads out the light signal from each individual sheet using a pixelated sensor placed on top of the stack (over the sheet edges). Beam range can then be estimated from the measured light dose distribution. The system is calibrated by shooting a high-energy proton beam through the entire stack in both directions.
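A reconstruction along these lines can be sketched as: find the sheet with the peak light signal, then interpolate between neighbouring sheets for sub-sheet resolution. The parabolic-interpolation step here is a generic trick, not necessarily the UCL team's actual analysis:

```python
def reconstruct_range(light_per_sheet, sheet_thickness_mm):
    """Estimate beam range from per-sheet scintillation light: locate
    the Bragg-peak sheet, then refine with a parabolic fit through the
    peak and its two neighbours (a common sub-sample interpolation
    trick; the real analysis may differ)."""
    i = max(range(len(light_per_sheet)), key=lambda j: light_per_sheet[j])
    if 0 < i < len(light_per_sheet) - 1:
        a, b, c = light_per_sheet[i - 1], light_per_sheet[i], light_per_sheet[i + 1]
        denom = a - 2 * b + c
        offset = 0.5 * (a - c) / denom if denom != 0 else 0.0
    else:
        offset = 0.0
    # +0.5 places the estimate at the centre of the peak sheet.
    return (i + offset + 0.5) * sheet_thickness_mm

# Synthetic signal: slow rise then a sharp Bragg peak near sheet 7,
# for hypothetical 3 mm sheets.
signal = [1.0, 1.1, 1.2, 1.4, 1.6, 2.0, 2.8, 4.0, 3.0, 0.2]
est = reconstruct_range(signal, 3.0)
print(est)  # a sub-sheet range estimate in mm
```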
Jolly and colleagues tested the prototype device at several sites, including MedAustron, the Heidelberg Ion Beam Therapy Center and the Birmingham Cyclotron, where it performed its first Bragg peak measurement last March. The system demonstrated a proton range reconstruction accuracy of about 100 µm, well below the clinical requirement of 1 mm.
The team also verified the radiation hardness of the prototype system by performing a “fry up” at the Clatterbridge Cancer Centre. After continuous irradiation for an entire day, the device showed less than 5% reduction in peak light output and no change in range accuracy. “The detector survived almost 6500 Gy, about a year’s worth of dose,” Jolly noted. “The next step is to build a system for the clinic that is self-contained, easy to use and robust.”
physicsworld.com/a/physics-drives-ongoing-developments-in-proton-therapy/
|
|
|
Post by swamprat on Dec 21, 2019 16:51:39 GMT
China Could Be Turning on Its 'Artificial Sun' Fusion Reactor Really Soon Kristin Houser, Futurism
19 Dec 2019
(sakkmesterke/iStock)
In March, Chinese researchers predicted that the nation's HL-2M tokamak - a device designed to replicate nuclear fusion, the same reaction that powers the Sun - would be built before the end of 2019.
No word yet on whether that's still the case, but in November, Duan Xuru, one of the scientists working on the "artificial sun," did provide an update, saying that construction was going smoothly and that the device should be operational in 2020 - a milestone that experts now tell Newsweek could finally make nuclear fusion a viable energy option on Earth.
If scientists can figure out how to harness the power produced by nuclear fusion, it could provide a near-limitless source of clean energy.
For decades, that's made fusion power a holy grail for energy researchers.
But the problem is that they've yet to figure out a cost-effective way to keep extremely hot plasma confined and stable long enough for fusion to take place.
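The usual way to quantify "confined and stable long enough" is the Lawson (triple-product) criterion: the product of density, temperature and confinement time must exceed roughly 3×10²¹ m⁻³·keV·s for D-T ignition. A sketch with illustrative (not HL-2M) plasma parameters:

```python
def lawson_triple_product(n_m3, t_kev, tau_s):
    """Fusion triple product n*T*tau in m^-3 · keV · s. For D-T
    ignition the commonly quoted threshold is roughly 3e21 in these
    units."""
    return n_m3 * t_kev * tau_s

IGNITION = 3e21  # approximate D-T ignition threshold, m^-3 keV s

# Illustrative tokamak-like plasma: 1e20 m^-3 at 15 keV held for 1 s.
tp = lawson_triple_product(1e20, 15, 1.0)
print(tp >= IGNITION)  # False -> this plasma still falls short
```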
China's HL-2M tokamak might be the device that's finally up to that challenge - or at least yields the clues needed to overcome it.
"HL-2M will provide researchers with valuable data on the compatibility of high-performance fusion plasmas with approaches to more effectively handle the heat and particles exhausted from the core of the device," fusion physicist James Harrison, who isn't involved with the project, told Newsweek.
"This is one of the biggest issues facing the development of a commercial fusion reactor," he continued, "and the results from HL-2M, as part of the international fusion research community, will influence the design of these reactors."
www.sciencealert.com/china-s-promising-artificial-sun-fusion-reactor-could-be-operation-next-year?fbclid=IwAR0q_PTcgrORUdsc7Ku6Lkx8pwAcAYlNrk-ZVo_Lna01kCT8-xJ2X9YGHXc
|
|