Researchers at the Korea Advanced Institute of Science and Technology
(KAIST) have developed K-Glass, a wearable, hands-free HMD that features
a built-in augmented reality (AR) processor.
Unlike virtual reality, which replaces the real world with a
computer-simulated environment, AR incorporates computer-generated
digital data into the user's view of reality. With computer-generated
sensory inputs such as sound, video, graphics, or GPS data, the user's
real, physical world becomes live and interactive. Augmentation takes
place in real time and in semantic context with the surrounding
environment.
Most commonly, location-based or computer-vision services are used to
generate AR effects. Location-based services activate motion sensors to
identify the user's surroundings, whereas computer vision uses
algorithms such as facial, pattern, and optical character recognition,
or object and motion tracking, to distinguish images and objects. Many
current HMDs deliver augmented-reality experiences through
location-based services by scanning markers or barcodes printed on the
backs of objects. The AR system tracks the codes or markers to identify
objects and then aligns them with virtual content. However, this
approach is difficult to apply to objects or spaces that carry no
barcodes, QR codes, or markers, particularly those in outdoor
environments, which therefore cannot be recognized.
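The marker-based approach described above can be sketched as a simple lookup: a tracked marker ID selects the virtual content to overlay, and anything without a marker falls through unrecognized. The registry and IDs below are invented for illustration; they are not part of any real HMD's API.

```python
# Toy sketch of marker-based AR lookup (marker IDs and registry entries
# are invented for this example).

MARKER_REGISTRY = {
    101: "coffee-cup label",
    202: "museum exhibit card",
}

def augment(detected_marker_id):
    """Map a tracked marker ID to the virtual content to overlay."""
    label = MARKER_REGISTRY.get(detected_marker_id)
    if label is None:
        # The limitation noted above: unmarked objects cannot be recognized.
        return "no overlay (marker unknown)"
    return f"overlay virtual content for: {label}"

print(augment(101))  # a registered marker gets an overlay
print(augment(999))  # an unmarked object yields nothing
```

The failure path for unknown IDs is exactly the weakness the article identifies: the pipeline has no way to recognize objects that never entered the registry.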
To solve this problem, Hoi-Jun Yoo, Professor of Electrical
Engineering at KAIST, and his team developed an AR chip that works much
like human vision, claiming a world first.
This processor is based on the Visual Attention Model (VAM), which
duplicates the human brain's ability to process visual data. VAM,
almost unconsciously or automatically, picks out the most salient and
relevant information about the environment in which human vision
operates and discards data that does not need to be processed. As a
result, the processor can dramatically speed up the computation of
complex AR algorithms.
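The idea of discarding non-salient data can be illustrated with a crude center-surround contrast measure: pixels that stand out from their neighborhood survive, the rest are dropped before any expensive processing. The grid, threshold, and contrast measure below are illustrative assumptions, not KAIST's actual VAM algorithm.

```python
# Minimal sketch of saliency-based filtering in the spirit of a Visual
# Attention Model: compute a crude center-surround contrast and discard
# low-saliency pixels so later stages process far less data.
# (Grid, threshold, and contrast measure are illustrative assumptions.)

def saliency_map(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Mean of the 3x3 neighborhood (clipped at the borders).
            nb = [img[j][i]
                  for j in range(max(0, y - 1), min(h, y + 2))
                  for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = abs(img[y][x] - sum(nb) / len(nb))
    return out

def salient_pixels(img, threshold=10.0):
    sal = saliency_map(img)
    return [(y, x) for y in range(len(img)) for x in range(len(img[0]))
            if sal[y][x] > threshold]

# A flat background with one bright spot: only the spot's neighborhood
# exceeds the threshold, so most of the image is filtered away.
image = [[10] * 5 for _ in range(5)]
image[2][2] = 200
print(salient_pixels(image))
```

Of the 25 pixels, only the small region around the bright spot remains, which is the data reduction that lets the downstream AR computation run faster.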
The AR processor has a data-processing network similar to that of the
human brain's central nervous system. When the human brain perceives
visual data, different sets of interconnected neurons work concurrently
on each fragment of the decision-making process; one group's work is
relayed to another group of neurons for the next round, which continues
until a set of decider neurons determines the character of the data.
Likewise, the artificial neural network allows parallel data
processing, alleviating data congestion and reducing power consumption
significantly.
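The relay of fragments between groups of "neurons" resembles a pipeline of concurrent stages connected by queues. The sketch below is a generic software analogy, not the chip's architecture: the stage functions and data are invented, and each stage simply transforms a fragment and passes it on.

```python
# Hedged sketch of pipelined, parallel processing in the style described
# above: independent worker stages pass fragments through queues, so each
# stage can run concurrently instead of one monolithic step.
# (Stage functions and data are invented for illustration.)

import queue
import threading

def stage(fn, inbox, outbox):
    while True:
        item = inbox.get()
        if item is None:          # sentinel: shut the stage down
            outbox.put(None)
            break
        outbox.put(fn(item))

q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
# Stage 1 and stage 2 are toy stand-ins for successive neuron groups.
threading.Thread(target=stage, args=(lambda x: x * 2, q1, q2)).start()
threading.Thread(target=stage, args=(lambda x: x + 1, q2, q3)).start()

for fragment in [1, 2, 3]:
    q1.put(fragment)
q1.put(None)

results = []
while (item := q3.get()) is not None:
    results.append(item)
print(results)  # each fragment passed through both stages: [3, 5, 7]
```

Because each stage holds only one fragment at a time and hands it off immediately, work flows through without piling up, which is the congestion-relief property the article attributes to the chip's network.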
KAIST's AR processor is fabricated in a 65 nm manufacturing process
with an area of 32 mm². It delivers 1.22 TOPS (tera-operations per
second) of peak performance when running at 250 MHz and consumes 778 mW
on a 1.2 V power supply. The ultra-low-power processor achieves an
energy efficiency of 1.57 TOPS/W while processing 720p video at 30 fps
in real time, a 76% improvement in power consumption over comparable
devices.
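The reported efficiency figure follows directly from the other two numbers: peak throughput divided by power consumption.

```python
# Checking the article's figures: peak throughput divided by power
# should give roughly the stated energy efficiency.

peak_tops = 1.22      # tera-operations per second at 250 MHz
power_w = 0.778       # 778 mW on a 1.2 V supply

efficiency = peak_tops / power_w
print(f"{efficiency:.2f} TOPS/W")  # ≈ 1.57 TOPS/W, matching the article
```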
http://en.ofweek.com/news/Researchers-developed-K-Glass-with-a-built-in-AR-processor-8251