By Business Wire | July 2, 2014 10:25 AM EDT
Optical imaging methods are rapidly becoming essential tools in biomedical science because they’re noninvasive, fast, cost-efficient and pose no health risks since they don't use ionizing radiation. These methods could become even more valuable if researchers could find a way for optical light to penetrate all the way through the body's tissues. With today’s technology, even passing through a fraction of an inch of skin is enough to scatter the light and scramble the image.
Now a team of researchers from Spain’s Jaume I University (UJI) and the University of València has developed a single-pixel optical system based on compressive sensing that can overcome the fundamental limitations imposed by this scattering. The work was published today in The Optical Society’s (OSA) open-access journal Optics Express.
“In the diagnostic realm within the past few years, we’ve witnessed the way optical imaging has helped clinicians detect and evaluate suspicious lesions," said Jesús Lancis, the paper’s co-author and a researcher in the Photonics Research Group at UJI. "The elephant in the room, however, is the question of the short penetration depth of light within tissue compared to ultrasound or x-ray technologies. Current knowledge is insufficient for early detection of small lesions located deeper than a millimeter beneath the surface of the mucosa."
"Our goal is to see deeper inside tissue,” he added.
To achieve this, the team used an off-the-shelf digital micromirror array from a commercial video projector to create a set of microstructured light patterns that are sequentially superimposed onto a sample. For each pattern, they measure the transmitted energy with a photodetector that can sense the presence or absence of light but has no spatial resolution. Finally, they apply a signal-processing technique called compressive sensing, which acquires the data in compressed form at the moment of measurement, allowing the image to be reconstructed from far fewer measurements than the number of pixels it contains.
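The measurement scheme described above can be sketched in a few lines of code. This is an illustrative toy model, not the authors' actual reconstruction pipeline: it assumes random ±1 micromirror patterns, a sparse scene, and a standard sparse-recovery solver (orthogonal matching pursuit) standing in for whatever algorithm the team used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scene: N "pixels", only K of them bright (sparse in the pixel basis).
N, M, K = 256, 64, 5
x_true = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x_true[support] = rng.uniform(1.0, 2.0, size=K)

# Each row is one micromirror pattern (+1 / -1, i.e. mirror on/off, recentred).
# Each single-pixel measurement is the total transmitted light under one pattern.
A = rng.choice([-1.0, 1.0], size=(M, N)) / np.sqrt(M)
y = A @ x_true  # M scalar photodetector readings, no spatial resolution

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the k pattern columns that
    best explain the measurements, then least-squares fit their amplitudes."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x = np.zeros(A.shape[1])
    x[idx] = coef
    return x

# Reconstruct a 256-pixel image from only 64 scalar measurements.
x_rec = omp(A, y, K)
print(np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```

The point of the sketch is the measurement model: because each reading sums light over the whole scene, spatial information is carried by the patterns rather than by a pixelated sensor, which is why scattering between sample and detector is far less damaging.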
One of the most surprising aspects of the team’s work is that they use essentially a single-pixel sensor to capture the images. While most people think that more pixels result in better image quality, there are some cases where this isn't true, Lancis said. In low-light imaging, for instance, it's better to integrate all available light into a single sensor. If the light is split into millions of pixels, each sensor receives a tiny fraction of light, creating noise and destroying the image.
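The shot-noise argument can be made concrete with a back-of-the-envelope calculation (the photon counts here are illustrative, not from the paper): for Poisson-distributed shot noise, the signal-to-noise ratio scales as the square root of the photons a detector collects, so one integrating detector beats a megapixel array splitting the same light by a factor of the square root of the pixel count.

```python
import math

total_photons = 1_000_000   # fixed light budget reaching the detector plane
n_pixels = 1_000_000        # hypothetical megapixel sensor splitting that light

# Poisson shot noise: SNR = mean / sqrt(mean) = sqrt(photon count)
snr_single = math.sqrt(total_photons)                 # all light on one detector
snr_per_pixel = math.sqrt(total_photons / n_pixels)   # light split a million ways

print(snr_single, snr_per_pixel)  # 1000.0 1.0
```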
“Something similar happens when you try to transmit images through scattering media,” Lancis said. “When we use a conventional digital camera to get an image, we only see the familiar noisy pattern known as ‘speckle.’ In compressive imaging, since we aren’t using pixelated sensors, it should be less sensitive to light scrambling and enable transmission of images through scattering.”
Also notable, the team’s technique could operate through dynamic scattering. “Most scattering media of interest, like biological tissue, are dynamic in the sense that the scatter centers continuously change their positions with time -- meaning that the speckle patterns are ‘in motion.’ This is ideal for some applications because monitoring the changes of the speckle can reveal information about the sample, but the drawback is that it’s a major nuisance to transmit or get images,” Lancis pointed out. “Our technique, however, requires no calibration of the medium, and its fluctuations during the sensing stage don’t limit imaging ability.”
What’s ahead for the team? “Our next goal is to break the barriers of light penetration depth inside a scattering medium with the state-of-the-art megapixel programmable spatial light modulators used in consumer electronics,” Lancis said. To do this, they’ll need to demonstrate that their technique works even when the sample is embedded inside the tissue.
Paper: “Image transmission through dynamic scattering media by single-pixel photodetection,” E. Tajahuerce et al., Optics Express, Vol. 22, Issue 14, pp. 16945-16955 (2014).
EDITOR’S NOTE: Images and video are available to members of the media upon request. Contact Angela Stark, [email protected].
About Optics Express
Optics Express reports on new developments in all fields of optical science and technology every two weeks. The journal provides rapid publication of original, peer-reviewed papers. It is published by The Optical Society and edited by Andrew M. Weiner of Purdue University. Optics Express is an open-access journal and is available at no cost to readers online at www.OpticsInfoBase.org/OE.
Founded in 1916, The Optical Society (OSA) is the leading professional society for scientists, engineers, students and business leaders who fuel discoveries, shape real-world applications and accelerate achievements in the science of light. Through world-renowned publications, meetings and membership programs, OSA provides quality research, inspired interactions and dedicated resources for its extensive global network of professionals in optics and photonics. For more information, visit www.osa.org.