On Friday, January 14, 2011, I had the privilege of hopping on the train, my favorite way to travel lately, and heading south to Washington, DC, where I participated in the initial meeting of the FCC's Emergency Access Advisory Committee (EAAC).
For those of you not familiar with the EAAC, the following excerpt is from its charter document:
The Committee's Official Designation
The official designation of this advisory committee of the Federal Communications Commission (Commission or FCC) is the "Emergency Access Advisory Committee" as prescribed by the Twenty-first Century Communications and Video Accessibility Act of 2010 (Communications and Video Accessibility Act or CVAA).

The Committee's Objective and Scope of its Activities
The EAAC is hereby chartered for the purpose of implementing sections of the CVAA that pertain to making next generation emergency 911 services accessible by individuals with disabilities, as a part of the migration to a national Internet protocol-enabled emergency network (NG911).
With one of the committee's primary goals being to enhance emergency services for individuals with disabilities, the meeting included, alongside the carriers, manufacturers, and public safety representatives, a number of individuals with speech or hearing disabilities. The Commission did a fantastic job accommodating them, providing American Sign Language interpreters and closed-caption text superimposed on the video feed using Communication Access Realtime Translation (CART), a method of realtime speech-to-text transcription. I was completely awestruck by the level of interpretation provided, especially with all of the technical jargon. Aside from the occasional closed-caption slip of "PEACE APPS" (wait for it, you'll get the joke in a second), the transcription was right on the money.
It really brought home to me the fact that today's antiquated 911 infrastructure is nowhere near capable of communicating effectively with individuals with disabilities. On the other hand, an NG911-capable PSAP, backed by ESInet SIP connectivity, could easily bring those individuals into the modern era and away from their Baudot-enabled TTY devices, acoustic couplers screaming away like the modems of days past. Go ahead, Google Baudot, then come back when you've stopped laughing. Do these folks deserve to be relegated to this slow, outmoded technology in today's high-speed, always-on world most of us enjoy?
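To put a number on just how slow that legacy link is, here's a back-of-the-envelope sketch in Python. The framing assumptions (one start bit, five data bits, 1.5 stop bits, six characters per "word" including the space) are typical figures for a 45.45-baud Baudot TTY line, not something taken from the committee materials:

```python
# Rough throughput of a Baudot (ITA2) TTY link at 45.45 baud.
# Assumed framing: 1 start bit + 5 data bits + 1.5 stop bits per
# character -- a common TTY convention; figures are illustrative.

BAUD_RATE = 45.45             # line rate in bits per second
BITS_PER_CHAR = 1 + 5 + 1.5   # start + data + stop bits

chars_per_second = BAUD_RATE / BITS_PER_CHAR
# Count a "word" as 5 letters plus a space = 6 characters.
words_per_minute = chars_per_second * 60 / 6

print(f"{chars_per_second:.1f} chars/sec, ~{words_per_minute:.0f} wpm")
```

Roughly six characters per second, around 60 words per minute at best, versus the effectively unbounded rates of a modern IP text or video session.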
Video Killed the Radio Star

One thing that really impressed me was the speed and ease of communication the ASL interpreters provided. I had several conversations with different people whose sign language was translated to speech, and my speech to sign language, and I have to admit that the latency introduced was minimal and the flow of conversation quite natural. Given that natural flow, and the ability to convey complex thoughts and ideas via ASL, it was clear to me that video would likely be the communication method of choice for many individuals who are deaf.
Dr. Paul Michaelis, from Avaya Labs in Denver, raised another interesting point in his closing comments: a person with a specific disability, such as being hard of hearing, may prefer to communicate in a multi-modal fashion. For example, they would most likely want to speak to the PSAP dispatcher, because their speech is not impaired, but receive the responses via Real-Time Text because of their hearing loss. Ultimately, this places the responsibility on the PSAP to provide the appropriate translation resources. Once again, this is where NG911 comes to the rescue.
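For illustration, in a SIP-based NG911 world that multi-modal call could be negotiated with a session description offering both an audio stream and a real-time text stream (the "t140" RTP payload format defined in RFC 4103). The addresses, ports, and payload numbers below are invented for the example:

```
v=0
o=caller 2890844526 2890844526 IN IP4 198.51.100.1
s=Multi-modal emergency call
c=IN IP4 198.51.100.1
t=0 0
m=audio 49170 RTP/AVP 0
a=rtpmap:0 PCMU/8000
m=text 49172 RTP/AVP 98
a=rtpmap:98 t140/1000
```

The caller speaks over the audio stream while the PSAP's responses arrive character by character over the text stream, so neither side is forced into a single mode.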
With NG911 being a network of networks, not unlike the Internet, resources across a much wider geography can now be pooled and shared where needed. For example, a small community on the outskirts of a large city may have only a single-seat PSAP. Even under the best conditions, staffing that center with a call taker possessing every skill set would be next to impossible. But as part of the larger ESInet infrastructure, a dispatcher with a special language skill or translation ability could easily be added to the collaborative, contextual conference established for that incident.
So once again, I have proven to myself that no matter how much you think you know, there is always more to learn, and there is always another side to the story. I saw for myself the additional value NG911 could bring to the public, proving once again that we are not just developing technology for technology’s sake; we’re developing technology to save lives.
Since the EAAC meetings will be ongoing each month, and I will be attending as the alternate for Dr. Michaelis, I'll be dedicating one blog a month to keeping you in the loop on what's happening with this important committee. I welcome your comments and questions on this, or any other topic related to E911 or NG911. You can email me directly here.
Update: February Meeting Info

The second meeting of the EAAC will be held at the Commission headquarters, 445 12th St., SW, Washington, D.C., on Friday, February 11, 2011, from 10:30 a.m. to 5:00 p.m. (EST). All meetings shall be open to the public.
Mark J. Fletcher, ENP, is the Chief Architect for Worldwide Public Safety Solutions at Avaya. As a seasoned professional with nearly 30 years of service, he provides the strategic roadmap and direction of Next Generation Emergency Services in both the Enterprise and Government portfolios at Avaya. In 2014, Fletcher was made a member of the NENA Institute Board in the US and co-chair of the EENA NG112 Committee in the EU, where he provides insight to state and federal legislators globally, driving forward both innovation and compliance.