Monday, November 18, 2013
Breaking News as Native Transmedia Journalism
As many as twenty bullet holes riddle the entryway of the New Life Church in Colorado Springs, Monday, Dec. 10, 2007, where a day earlier a gunman entered the building. Two victims are dead in addition to the gunman, and another two are injured in the second shooting to hit a Colorado religious organization in a day. The gunman in the Colorado Springs shooting was killed by a church security guard. Two more are dead in a possibly related shooting at a dormitory for missionaries in Arvada, Colo., a suburb of Denver. (Kevin Moloney for The New York Times)
Anyone who follows news in the 21st-century mediascape has experienced this native and uncoordinated form of transmedia journalism first hand. Here in Colorado last month we suffered massive and destructive flooding. The story is still unfolding, and the aftermath will endure for months more. When the news struck that local mountain streams would surpass 100-year flood levels, my friends, colleagues and I dove headlong into a diverse array of media forms and channels to digest the news. I turned on the local TV broadcasts, listened actively to local public radio, watched Twitter hashtags, Facebook posts and Instagram feeds, awaited SMS texts from the university and picked up the phone to talk to friends and relatives.

I didn't get the story from one place; I used multiple devices and technologies of all ages. I didn't get it in any one media form; the story came as text, video, audio, conversation and even the clouds outside my window. I absorbed complete stories from multiple sources and sewed them into a larger and more complex picture of what was happening than I could have had I depended on only one of them. The same applies to other breaking stories, from the Navy Yard shootings to the Boston Marathon bombings to Sandy Hook Elementary. Once engaged with a story that demands fast attention, we immerse ourselves in multiple spaces in the mediascape, online and off, to gather the complete and current picture.

This is not a planned and curated form of transmedia journalism. It is an emergent form created by each individual as he or she engages with the story. It illustrates the idea that we can engage with multiple characters across multiple stories in multiple places to achieve what game designer Neil Young calls "additive comprehension." We are deeply engaged when rapidly moving events raise cultural, civil or environmental concerns, or have an immediate impact on our lives. A drive to know more, see more and stay up-to-date leads us naturally to transmedial consumption of news.

But what about the stories that don't scream for immediate attention from every media form and channel available? Here, as we do for traditional news stories, we depend on style, human connection and compelling narratives to draw a public. We can carry those techniques into predesigned transmedia narratives so that, once engaged, the public has somewhere to find more. Through transmedia implementation we also open many more access points for the public to find our story.
Thursday, July 18, 2013
ATLAS Ph.D. student discusses PartoPen technology
PartoPen shown with partograph paper form in the background.
Underwood demonstrates the functionality of the PartoPen while a nurse midwife monitors a laboring patient and members of Kenya's media look on.
Wednesday, June 26, 2013
James McCartney
Tuesday, May 21, 2013
In real time, the Microsoft Kinect system scans the movements of students while software programmed by a student team places their silhouettes in an animated graphic background.
Behnke explained, “This year, I helped an art teacher bring computer science to students at Skyline High School in Longmont. It was a very gratifying experience!”
Interactive installation designed by students under Behnke's direction allows participants to see their image instantly transformed through the creative use of the Microsoft Kinect game controller.
Skyline High School student presents a set design he created in which the multimedia, interactive display and performance will be presented.
The writer, Ira G. Liss, is assistant communications director at ATLAS Institute and a performing artist. See video of his original commentary, songs and spoken word here.
Monday, May 6, 2013
Exploring 3D Tactile Technologies for Visually Impaired Children
2D to 3D Software Interface
Photo at right: Five textured layouts were fabricated by computer science graduate student Jeeeun Kim. Each one could potentially be used as material for tactile storybooks. Clockwise from left: a plastic yellow layout shows a tactile illustration of a room interior, formed by a 3D printer; three executions of a diagram of an egg use three materials: 1. cut and folded blue paper; 2. multi-layered laser-cut wood; 3. etched red plastic; sitting to the right of a pen, a transparent plastic sheet becomes a base for raised forms, made with a glue gun, that illustrate a room interior.
Several faculty members serve as principal investigators and senior advisors. All contribute to moving the research forward. One of Stangl’s computer science professors, Tom Yeh, proposed the project and ignited her curiosity about tactile perception and interface design for children with visual impairments.
Photo at left: At $2,000, this MakerBot 3D printer represents a price breakthrough that could make it possible for parents of visually impaired children to print their own tactile learning aids at home.
The writer of this article, Ira Liss, is ATLAS Institute's assistant director of communications and also a pianist, singer/songwriter and performing artist. See videos of his original work. Contact him here.
Wednesday, May 1, 2013
Application Developers Alliance and the University of Colorado Boulder’s ATLAS Institute Team Up To Advance App Innovation Research
Washington, D.C. (May 1, 2013) - The Application Developers Alliance has established the Alliance Research Fellowship at the University of Colorado Boulder’s ATLAS Institute. The goal of the fellowship is to further understanding of the dynamics at play in the app development ecosystem. The ATLAS Institute has appointed Ph.D. student Sid Saleh the inaugural Application Developers Alliance Research Fellow.
“The Alliance is thrilled to partner with the ATLAS Institute to provide first-of-its-kind insight into the developer community and innovation in the apps industry,” said Jon Potter, President of the Application Developers Alliance. “This allows us to tailor our programs and advocacy efforts to meet the needs of developers. Furthermore, our ground-breaking research will give policymakers insight into the work developers are doing, their contributions to our economy, and the challenges they face.”
Saleh has launched two research inquiries. The first is an ongoing poll of application developers that aims to accurately describe the developer community: the number and types of developers, the kind of work they are doing, the specific challenges they face, and how these factors are evolving over time. The second study was launched in conjunction with the Alliance’s Developer Patent Summits and looks into developers’ experiences with the software patent system.
“We are honored to have the Application Developers Alliance associated with the work of ATLAS through support of our Technology, Media and Society Ph.D. program,” said John Bennett, Director of the ATLAS Institute and Archuleta Professor of Computer Science at the University of Colorado Boulder. “The ATLAS Institute’s focus on interdisciplinary, technology-related research makes us particularly qualified to partner with the Alliance in exploring a wide variety of issues that matter to the developer community. Our scholarship incorporates real-world data and practical challenges. We expect the research the Alliance Fellow conducts will provide unique insight into an industry that has such an impact on global digital society.”
About the Application Developers Alliance
The Application Developers Alliance is an industry association dedicated to meeting the unique needs of application developers as creators, innovators, and entrepreneurs. Alliance members include more than 20,000 individual application developers and more than 100 companies, investors, and stakeholders in the apps ecosystem.
About the ATLAS Institute, University of Colorado Boulder

A campus-wide initiative, the ATLAS Institute leads discovery and innovation at the intersection of technology and society. We seek to better understand the interaction of people with information and communication technology (ICT), and to realize the full potential of that interaction. ATLAS interdisciplinary programs help develop creative designers, critical thinkers, effective leaders, capable learners, transdisciplinary innovators and engaged global citizens.

Contact: Courtney Lamie, Application Developers Alliance, 202-250-3006 (desk); Bruce Henderson, ATLAS Institute, 303-735-0899 (desk)
Wednesday, April 24, 2013
International multimedia artist Nicolas Jaar speaks at ATLAS at 6 p.m. this Friday, April 26, as part of two free ATLAS events
6 p.m. this Friday, April 26, ATLAS 100, Cofrin Auditorium, first floor of the Roser ATLAS Building
Also: The ATLAS Center for Media, Arts and Performance and the Communikey Festival of Electronic Arts present Masaki Batoh performing Brain Pulse Music, which features music based on brain waves.
7:45 p.m. this Friday, April 26, ATLAS Black Box theater, downstairs, lowest basement level B2, Roser ATLAS Building
Nicolas Jaar is founder of the record label Clown & Sunset and released his debut album "Space Is Only Noise" in 2011. Record label colleagues plus emerging Colorado-based music producers will join him for a Q&A discussion.
Jaar has performed around the world, including an acclaimed five-hour improvisation at NYC’s Museum of Modern Art that he will perform again during the Communikey Festival.
Masaki Batoh's Brain Pulse Music grew out of research into the bioelectric functions of the human brain, combined with the traumatic aftermath of the Great East Japan Earthquake. The project was first conceived as a way to realize music from extracted brain waves.
Both events are free and open to the public.
Audiences are advised to arrive 10 minutes early.
Seating is limited and first-come, first-served.
The ATLAS Speaker Series is made possible by a generous donation from Idit Harel Caperton and Anat Harel.
Tuesday, March 19, 2013
Live and in Concert: New Sound Controller Technologies Enhance and Create Music
Following its mission of applying technology to a wide range of disciplines, including the performing arts, CU's ATLAS Institute recently presented performances that explored the use of controller technologies to produce sound and music.
On March 2, 2013, computer science Ph.D. student Charles Dietrich collaborated with the Boulder Laptop Orchestra (BLOrk), the ensemble-in-residence of the ATLAS Center for Media, Arts and Performance, on a piece called “Gestures.” In this piece, three music students moved their hands and fingers in front of 3D cameras that recognized the three-dimensional positions of each hand's center and fingers.
Three students control sound through the movement of their hands, which are tracked by Intel 3D cameras. Images projected on the screen behind them show how the cameras' software diagrams the movement of their hands by tracking the centers of their palms and the positions of their fingers.
A Gesture-Based Music Controller
"Our hands are so expressive," Dietrich said. "A technology that can track and record their subtle movements can also help us develop new applications – in music, sculpture, visual arts and CAD programs. Those who study the body language of various cultures may have new tools for research as well."
The hands of doctoral student Charles Dietrich are read by an Intel 3D camera mounted to the top edge of his laptop. Inset image shows how his hands are interpreted by the software. Brightly colored lines are superimposed over his palm and fingers and change angle and shape as he moves. In this way, the software can detect whether hands are open or closed or shaped in a way that might trigger a change in the sound specified by the programmer/developer/musician.
The camera unit incorporated a depth camera and a standard color camera. The depth camera used infrared light and an infrared detector to determine depth from the time-of-flight (ToF) of the light, similar to radar but with light instead of radio waves. Intel software processed the depth and color data together to determine the hand landmarks.
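The arithmetic behind time-of-flight is simple: depth equals the speed of light multiplied by the round-trip travel time, divided by two. Here is a minimal sketch; the example round-trip time is illustrative, not a specification of the Intel camera:

```python
C = 299_792_458.0  # speed of light in meters per second

def tof_depth(round_trip_seconds: float) -> float:
    """Depth = (speed of light x round-trip time) / 2.
    The division by two accounts for the light traveling out and back."""
    return C * round_trip_seconds / 2.0

# A round trip of about 6.67 nanoseconds corresponds to roughly 1 meter.
print(tof_depth(6.67e-9))  # ~1.0
```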
Dietrich used data from the Intel-developed camera software to assign sounds to hand movements. For example, the altitude of the left hand determined note or pitch. Opening or closing the right hand determined whether a note was played. The altitude of the right hand controlled loudness. In this way, Dietrich’s gesture-based controller functioned as a musical instrument.
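To make that mapping concrete, here is a minimal sketch in Python of how per-frame hand data might be turned into note, gate and volume values. The `HandFrame` structure, its normalized ranges and the chosen scale are hypothetical stand-ins, not the Intel SDK's actual output or Dietrich's code:

```python
from dataclasses import dataclass

@dataclass
class HandFrame:
    """Hypothetical per-frame hand data; a stand-in for the camera SDK's output."""
    left_y: float     # altitude of the left palm center, normalized 0.0 (low) to 1.0 (high)
    right_y: float    # altitude of the right palm center, normalized 0.0 to 1.0
    right_open: bool  # True if the right hand is open

# A one-octave C major scale as MIDI note numbers.
SCALE = [60, 62, 64, 65, 67, 69, 71, 72]

def frame_to_music(frame: HandFrame):
    """Map one camera frame to (note, gate, volume), following the design
    described above: left-hand altitude picks the pitch, the open right
    hand gates the note, and right-hand altitude sets loudness."""
    index = min(int(frame.left_y * len(SCALE)), len(SCALE) - 1)
    note = SCALE[index]
    gate = frame.right_open            # the note sounds only while the hand is open
    volume = int(frame.right_y * 127)  # MIDI velocity range is 0-127
    return note, gate, volume

# Example: left hand held high, right hand open at mid height.
print(frame_to_music(HandFrame(left_y=0.9, right_y=0.5, right_open=True)))
```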
13 Pedals, No Hands
Another controller used at the same concert was developed and played by College of Music Ph.D. student Hunter Ewen. Dubbed the “SCREAMboard” (an acronym of “soloist-controlled, real-time, electronic audio-manipulation board”), the foot-operated musical interface was used for real-time composition and improvisation.
Doctoral student Hunter Ewen presents his foot-operated interface, which allows hands-free control of 13 sound parameters. “My goal is to give electro-acoustic musicians a hands-free way to control all the special effects and sound features they could want. Achieving this with earlier tools has been a difficult, awkward and often expensive process,” Ewen said.
Thirteen pedals controlled recording, playing, looping (after being recorded, a sound or sound sequence is played back repeatedly as a “loop”), synchronizing loop lengths (managing the way multiple loops fit together or align with one another), panning (controlling the way sound moves among a room's arrangement of speakers) and volume.
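For readers curious how pedal-driven looping works, below is a toy sketch in Python. It models audio as plain lists of samples and leaves out real-time audio I/O entirely; the behavior is inferred from the description above, not taken from the SCREAMboard's actual implementation:

```python
class Looper:
    """A toy loop pedal: record samples, then play them back repeatedly."""

    def __init__(self):
        self.buffer = []       # recorded samples (a stand-in for real audio)
        self.recording = False
        self.position = 0

    def press_record(self):
        """Toggle recording; starting a fresh recording replaces the old loop."""
        self.recording = not self.recording
        if self.recording:
            self.buffer = []

    def feed(self, sample):
        """Called once per incoming audio sample while the performer plays."""
        if self.recording:
            self.buffer.append(sample)

    def next_sample(self):
        """Playback: cycle through the buffer endlessly, like a loop pedal."""
        if not self.buffer:
            return 0
        sample = self.buffer[self.position]
        self.position = (self.position + 1) % len(self.buffer)
        return sample

looper = Looper()
looper.press_record()               # pedal down: start recording
for s in [0.1, 0.5, -0.3]:          # "record" three samples
    looper.feed(s)
looper.press_record()               # pedal down again: stop recording
print([looper.next_sample() for _ in range(7)])  # loops 0.1, 0.5, -0.3, 0.1, ...
```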
In one piece, Ewen (shown at left and below at right) used his voice, a kazoo and several unconventional instruments to record and loop a variety of sounds that he layered into complex textures.
The colorful diagram of vertical bars projected on the screen behind the performers shows the status of the 13 different controller pedals.
In another piece, John Gunther (shown at left), associate professor of jazz studies and co-director of BLOrk, performed a solo piece using Ewen’s interface. He played his jazz flute, recording and looping melodic sequences that harmonized with one another to become a rich background upon which he improvised a solo.
From One Mouse to Many
John Drumheller, a CU College of Music instructor, director of music technology and BLOrk co-director, talked about controllers. “In the past we were limited to one slider or one controller when we used a computer and a mouse. We could control only one aspect of sound with that mouse. But now, with the introduction of the tablet, smart phones and touch screens, each finger can control a completely different sound or quality of sound.”
BLOrk co-director John Drumheller performs on a tablet during a December 2012 concert in the ATLAS Black Box theater.
It is now possible for our 10 fingers to control 10 sounds or aspects of sound (pitch, volume, rhythms, harmonies, etc.) simultaneously. Of course, for centuries the piano has allowed our 10 fingers to make 10 different sounds at the same time, but here we are talking about the ability to assign a different sound, instrument, rhythm or dynamic to each finger. (This is similar to Hunter Ewen's foot-pedal device, which lets a performer assign and control 13 different sounds or parameters.)
Four Controllers, One Touch Screen
In the music lab, Drumheller showed me his tablet for a moment. He set its screen up with four rectangles of different colors. As I touched the screen, I let my fingers slide around the surface, tap and wiggle. Instantly (much to my delight!), four different exuberant sounds were generated simultaneously as my fingers moved.
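The idea generalizes naturally: give each touch point its own voice. The sketch below uses hypothetical normalized coordinates rather than a real tablet touch API, mapping each finger's horizontal position to a pitch and its vertical position to a volume:

```python
# Each finger (touch point) drives its own voice. Touch coordinates are
# assumed normalized to 0.0-1.0; a real app would receive them from the
# tablet's touch-event API and feed the results to a synthesizer.

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale, MIDI note numbers

def touches_to_voices(touches):
    """touches: list of (x, y) points, one per finger currently down."""
    voices = []
    for x, y in touches:
        note = SCALE[min(int(x * len(SCALE)), len(SCALE) - 1)]
        volume = int((1.0 - y) * 127)  # higher on screen = louder
        voices.append({"note": note, "volume": volume})
    return voices

# Three fingers down at once -> three simultaneous, independent voices.
print(touches_to_voices([(0.1, 0.8), (0.5, 0.4), (0.9, 0.1)]))
```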
Soon, as Dietrich’s gesture-based technology progresses, we will be able to move hands and fingers in the air around us, unconfined by screen, tablet or keyboard. Our movements will instantly become sound. Dance and music, always complementary art forms, may become one.
~ ~ ~ ~
See additional photos of the March 2nd concert of the Boulder Laptop Orchestra (BLOrk) that took place in the downstairs ATLAS Black Box theater.
Get yourself on the ATLAS email list. ATLAS concerts, dance/theater performances, art installations and Speaker Series talks are always free and open to the public (well, with very few, rare exceptions).
Look into a list of upcoming ATLAS events.
Perhaps there are several you'd like to attend.
Contact the writer: Ira Liss is ATLAS Institute's assistant director of communications and also a pianist, singer/songwriter and performing artist. Enjoy videos of his original work.
Monday, February 4, 2013
The ATLAS Speaker Series is an educational and experiential resource for students, faculty and the larger community to discuss the challenges, opportunities and innovative applications of technology. Talks usually run from 4-5 p.m. in the Cofrin Auditorium, ground floor, ATLAS 100 (enter from lobby), unless otherwise noted.
The following speakers have been scheduled for the spring 2013 semester:
Digital Media and Music as an Instrument for Social Change
Paul Miller (aka DJ Spooky) will discuss his recent work exploring digital media, music and the ways that art can open minds and help people gain new perspectives on issues like climate change. He will perform his multimedia electronic compositions (using his own DJ app and wall-to-wall graphics), accompanied by CU College of Music cellist Megan Knapp and violinist Emily Lenck. Miller is a multimedia digital artist, musician, composer, remixer, author of “The Book of Ice” and the first artist-in-residence at NYC’s Metropolitan Museum of Art. His visit is a collaboration between ATLAS Institute, the Program for Writing and Rhetoric and the President's Fund for the Humanities.
6-7:30 p.m. Wednesday, Feb. 6, downstairs Black Box theater, lowest basement level, B2
How Open Source Software is Changing Technology
Stormy Peters will look at how we can predict the future of technology by observing the use and development of such software. As examples, she’ll show how nonprofits and open source software projects set technology directions and make bold statements about how to make the world a better place. She will also discuss her work at Mozilla and share insights about the future of information technology (IT).
Stormy Peters is director of websites & developer engagement at Mozilla. She is an advocate and supporter of open source software and its potential to change the software industry. She is also the founder and vice president of Kids on Computers, a nonprofit organization that sets up computer labs in developing countries.
4-5 p.m. Monday, Feb. 18, Cofrin Auditorium, ATLAS 100
Women and the Web: Bridging the Gender Internet Gap in Developing Countries
Renee Wittemyer will discuss data on the large Internet gender gap in developing countries and the social and economic benefits of securing Internet access for women. Recent studies find that, on average across the developing world, nearly 25 percent fewer women than men have access to the Internet, and the gap soars to nearly 45 percent in sub-Saharan Africa. Wittemyer will discuss this research and a call to action to double the number of women and girls online in developing countries from 600 million today to 1.2 billion within three years.
Wittemyer, director of social impact in Intel Corporation’s Corporate Responsibility Office, develops strategies for Intel’s girls and women’s campaign and manages relationships with strategic alliances, such as USAID, NGOs, and U.N. Women.
4-5 p.m. Monday, Feb. 25, Cofrin Auditorium, ATLAS 100
Education at the Intersection of Computer Science and Music
Ge Wang will discuss the transformative possibilities of music and computing to make art, strange new instruments and connections to people around the world. His talk will explore laptop and mobile phone orchestras, computer music languages, social music apps like Ocarina and Magic Piano – all examples of an emerging, growing space where computers, music and people interact. Wang is a Stanford University assistant professor in the Center for Computer Research in Music and Acoustics (CCRMA) and researches programming languages and interactive software systems for computer-generated music.
4-5 p.m. Monday, March 18, downstairs Black Box theater, lowest basement level, B2
Learning to Love Technology by Making Arts and Crafts
Leah Buechley will discuss her work developing accessible technologies that allow people of all ages – including those who might not otherwise be attracted to computing – to sketch, design, paint and sew while incorporating electronic technology. As examples, she will show student projects that blend computing with traditional arts, crafts, textiles, paper and wood. A developer of the LilyPad Arduino toolkit, Buechley received her master’s and Ph.D. degrees in computer science from CU. She is an associate professor at MIT and directs the High-Low Tech group at the MIT Media Lab.
4:15-5:15 p.m. Wednesday, April 10, Cofrin Auditorium, ATLAS 100
Mystery, Music and Digital Innovation
Nicolas Jaar, an international multimedia performing artist, will talk about his digital music work and creative process. Record label colleagues plus emerging Colorado-based music producers will join him in a Q&A discussion. Jaar is founder of the record label Clown & Sunset and released his debut album “Space Is Only Noise” in 2011. He has performed around the world, including an acclaimed five-hour improvisation at NYC’s Museum of Modern Art. The talk is a collaboration between ATLAS Institute and the Communikey Festival of Electronic Arts, http://communikey.us.
6-7:30 p.m. Friday, April 26, Cofrin Auditorium, ATLAS 100