The Daily Moth 7-12-19

Labor Secretary Alex Acosta resigns; R. Kelly arrested on federal child pornography charges; Philadelphia father and mob beat carjacker to death; Pakistani politicians’ social media snafus; Michigan School for the Deaf seeking full-time principal; Deaf PhD student receives $25k research grant from Microsoft; Midsommar Open Captioned Event at Alamo Drafthouse

[Transcript] Hello, welcome to the Daily Moth! It is Friday, July 12. Ready for news?




Labor Secretary Alex Acosta resigns

President Trump announced that Labor Secretary Alex Acosta will resign under pressure over his handling of a child sexual abuse case involving Jeffrey Epstein when Acosta was a federal prosecutor in Miami in 2008. The Daily Moth explained this in detail in yesterday's news.

Both of them appeared before reporters outside of the White House this morning. Trump claimed that the decision to resign was Acosta's own. Acosta said he is leaving because he didn't want to be a distraction from the Trump administration's successes.

Acosta is the only Hispanic person in Trump’s cabinet and will be officially leaving in a week.


R. Kelly arrested on federal child pornography charges

Musician R. Kelly was arrested again last night in Chicago on federal charges of child pornography, enticement of a minor, and obstruction of justice. He faces additional federal charges in New York, all related to sexual abuse crimes involving children.

Media reports explain that R. Kelly is suspected of paying thousands of dollars and giving gifts to the family of a girl who appeared in a sex video with him over 10 years ago. He was acquitted in a 2008 trial.

R. Kelly was released about five months ago on bail in an Illinois sexual abuse case.

He is now in custody at a jail in Chicago. R. Kelly’s lawyers say they look forward to him having his day in court.

Many say R. Kelly won't get off this time, as we are now in the #MeToo era, with more and more women feeling emboldened to speak out about their experiences of sexual abuse.


Philadelphia father and mob beat carjacker to death

In Philadelphia, a man who stole a car with three young children inside was killed by the children’s father and other men.

Police said the car was parked at a pizza restaurant at about 9 p.m. with its engine running. The mother and the father were inside the restaurant, while the three children, a 7-month-old, a 1-year-old, and a 5-year-old, were in the car.

A 54-year-old man got into the car and drove off, but got stuck in traffic. The father ran to the car and pulled the man out. The man ran away, but the father caught him and started fighting him. Other men joined in and beat the carjacker until he was knocked unconscious. He was later taken to a hospital, where he was pronounced dead.

The man who fought the carjacker was the father of two of the three children. So far there have been no charges against the parents.


Pakistani politicians’ social media snafus

Last week a Pakistani politician shared a video on Twitter that he believed showed a large airplane pulling up shortly after landing to avoid an oil tanker stopped in the middle of the runway.

The video was actually from the video game Grand Theft Auto V.

[Video clip]

The politician praised the pilot’s actions for escaping a disaster.

I'll admit, the video does look realistic. But many said it was clearly fake and not possible in real life.

This is the second time in a month that a Pakistani official was embarrassed on social media — last month, while a Pakistani regional minister was giving a news conference on Facebook Live, a cat filter somehow appeared on his face. It stayed there for several minutes and also covered the faces of two other officials.

So, this is a lesson that all of us can learn from — to be careful to not mistake video game clips for real life and to turn off filters while live-streaming!


Michigan School for the Deaf seeking full-time principal

The Michigan School for the Deaf has posted a job opening for a permanent, full-time principal. The process is being closely watched by parents, students, and stakeholders of MSD, who have protested and spoken at town halls in recent weeks to criticize the school for allowing staff who are not fluent in ASL to take leadership positions.

The job posting requires at least a master's degree in a field of education, two years of experience as a certified teacher, and that the candidate be "proficient in American Sign Language and written English." The salary is between $60,000 and $90,000.

Christian Young, a Daily Moth contributor, was able to collect statements from several organizations on this search.

The Michigan Deaf Association said they are committed to collaborating with MSD, the MSD Alumni Association, the NAD, and the Office of Special Education throughout the process. They want someone who is culturally and linguistically appropriate.

A group called "MSD Parents for Change" said they are concerned and feel it is disrespectful that the job posting requires only two years of teaching experience — they said it should require at least five years, plus two years of administrative-level experience.

Here are some brief statements from the MSD community that Young filmed.

JACOB AND ELIZA: We want to sign and chat with our principal.

SARAH HOUSTON: It’s very, very important that the principal is able to interact with every child, parent, employee or any person that comes into the principal’s office. Especially with deaf people, the principal must be able to effectively communicate with each one of them.

CHANNING: I want support for an ASL fluent principal at MSD.

JILLIAN GRUETZNER: What I've always wanted to see in a principal is not just someone who has had some success at a deaf school, but someone with an undying commitment to see this school truly thrive. That's what MSD really needs.

CLAY PARKS: Our deaf community expects a principal who has meaningful experience with deaf culture and with working in a deaf school.

ALEX: Thank you all for sharing. The link to the application is below in the transcript.


Deaf PhD student receives $25k research grant from Microsoft

Larwan Berke is one of 11 doctoral students to receive a 2019 Microsoft Research Dissertation Grant. Berke is a PhD student in computer science at Rochester Institute of Technology (RIT) and is currently doing an internship in Boston.

Each dissertation grant provides up to $25,000 in funding to doctoral students in the field of computing, helping each student advance their doctoral thesis work.

Berke will be using this grant for his work that uses automatic speech recognition as a captioning tool to enable greater accessibility for deaf and hard of hearing users. The Daily Moth reached out to Berke for an interview.

RENCA DUNN: You're interning in Boston. Could you explain a little more about the grant? What will it do, and what will you do with it?

LARWAN BERKE: Okay sure, it's a major award and I'll expand on that. Microsoft awarded me a PhD Dissertation Grant that will help me finish my work. I'm four years into my research now, and I'm making good progress. I have about one to three years to go; it depends. I need to study, have proper equipment, and have enough money. Time is also a factor when it comes to doing these things. The grant was $25,000, and it will help me buy equipment, a laptop, and things that I need. I can also hire two research assistants now. They're deaf, of course. Having them involved in the research helps me collect sufficient data to study. I'd been working with a few people gathering a small amount of data, but now I can expand my team and finish my PhD, which I hope will take only one more year. The grant will really help me move things along. It's a really nice boost, indeed.

RENCA: So, could you explain a little about your research?

LARWAN BERKE: Yeah, my field, if you remember, is Computer Science (CS), but my specialty is HCI, Human-Computer Interaction. That means I study the technology called ASR, Automatic Speech Recognition, and its use as a captioning tool. You might have seen it before on, you know, YouTube on your phone. You can turn the captions on.

But that technology was invented by hearing people. These inventions were meant to assist hearing people. Now think about it: why don't we have a tool meant to accommodate deaf people? Sure, this would be a good idea, but who is going to do the studying, evaluating, and measuring? It's not going to be hearing people; rather, it will be people from the deaf community. That's what my work is all about: measuring how deaf people use this technology. I'm not going to focus on the mathematical aspect, or the artificial intelligence. I'm not going to dig into the social science or social psychology. There are overlapping factors from both fields, and that's what HCI, Human-Computer Interaction, is. It truly is great work and fun too, because I'm interacting with deaf people and with hearing people who are knowledgeable and specialize in AI. I can bring all I've learned into my work. This will improve the technology, benefiting us, because of course deaf people will eventually demand more access to information. I know interpreters can help us with this, but sometimes there are none available; sometimes they get sick or get into a car accident. Maybe money is a factor. This could result in me not getting the information I need, and I want that information. That captioning tool is one way that could potentially help us all.

RENCA: Do you think closed captioning on television should be provided through an automated system? Or should it be a person behind a keyboard doing the captioning for our television? I'm curious about your opinion on that.

LARWAN BERKE: It's a big challenge indeed. Of course, I would prefer perfect results: 100% access, 24/7 captioning. But in the real world, it's not that simple. In the big cities, channels like CNN or FOX have the funds. They're mandated by a federal law, the CVAA, or the Communications and Video Accessibility Act, to provide suitable captioning. But in the smaller cities, with newer channels and YouTube, there is no law requiring captions. That means I could get absolutely no captioning, which is not what I want. That's why, when artificial intelligence, a robot, is utilized, I think it would be great in terms of helping us get more access to information. We have to find the right balance, because when AI is used, it could provide subpar captioning. I've had this experience and I would get frustrated. I would want to give up. I would get mad. Then I'm back to getting no captioning, with humans fighting companies that have money but don't want to spend it. We need that balance, and that's why I think it's important that we study the AI technology and how we can improve it. We would be watching, and it's important that deaf people are a part of this and approve the technology before it's used. Imagine sports, where the broadcasting is relatively easier. Whether they're scoring goals or making baskets, they're using the same vocabulary. That might be where the AI would be effective. But suppose a chemistry lecture where they're using long terminology; the AI might be no good in that scenario. That's where deaf people need to step in and set the boundaries. They can determine the scenarios where it would be appropriate to use the AI and identify the scenarios the AI isn't ready to handle. You can imagine that the technology will have improved in a year or two. It's like with each blink of an eye, there's improvement.
They make newer, faster technology, and then I can analyze it and see if there's been enough improvement to use it in scenarios outside of that boundary. Like in hospitals, for example: would you be okay with being provided captions there? I might not trust the technology. Even after 20 years, I probably would be resistant to it. On the other hand, the technology might be ready in 5 years. You can't predict when that happens. It's important that I'm able to observe and understand how to analyze and measure the data. Would it be acceptable and up to deaf people's standards? A human doing the captioning can still make mistakes sometimes. You can see these occasional captioning mistakes; you can see them backspacing and correcting themselves. It's the same idea, and it's important to have the right balance. If you can get the same results from the AI, then you can be happy with the service you're getting. That's our goal.

RENCA: That’s pretty cool.


RENCA: Berke also encourages more deaf and hard of hearing people to be involved with STEM.

LARWAN BERKE: I'd love to promote the field of STEM here. It's a really fun field to get into. Some people might not be into all that math, the coding, and all these complicated computations. I know the feeling, but as you progress and the more you learn, you will find it really enjoyable. You know, it's like you're "hacking," you're playing with the data — and I don't mean you'd break into the government's firewall and steal information. That's not what I mean at all. I mean you'd be playing with the technology. That's what the research is about. You'd be hacking all day long. When I got the grant, we made adjustments, got deaf lab technicians together, and came up with ideas. It's been really fun, and I want to encourage more deaf people to get into STEM so you can be the next researcher like me to win a grant like this one. Perhaps 10 years from now, I'll be there to see the next deaf grant winner. That would be pretty great, and I do hope for this.


RENCA: Berke explained that he also wants to build more trust between deaf and hard of hearing people and the captioning tool, just like how we trust interpreters to relay accurate information. Building that trust requires more research in his work.

Thank you, Berke, for sharing. It is great to see how technology continues to advance and to see the push for more accessibility for our community.


Midsommar Open Captioned Event at Alamo Drafthouse

ALEX ABENCHUCHAN: I'm with Chase Burton, who now lives here in Austin and is part of the Austin deaf community. We just watched an open captioned movie called Midsommar. So, what did you think of the movie?

CHASE BURTON: It was a very interesting and challenging movie! I really enjoyed the experience of having all these different emotions and then releasing them at the end of the movie. It felt good to discuss it afterwards on the panel. I really enjoyed that experience.

ALEX: The movie was like two and a half hours long, right? It's a long film. He just hosted an open captioned film event for the deaf community at the Alamo Drafthouse Cinema, which is a pretty neat movie theater.

ALEX: What are your recommendations for deaf people everywhere who want to host an open captioned film event? How could they arrange something like this?

CHASE BURTON: I think it really starts with personal relationships with your local movie theaters. Often you have to find the right person, and there's one thing I learned from this process: it really comes down to the studio, because they have strict rules about open captioning. With open captioning, the film is effectively altered, changing what the audience experiences while watching the show. You have to find your local movie theater, and they have to obtain permission from the film's production company in order to provide that experience to deaf people. Luckily, the Alamo already has experience with that, so if you're in a small town or something like that, you'd have to track down the manager or whoever runs the local events. I do feel that, with some warmth and understanding, you'll be able to explain why you want to have that experience. I think they would be very open to that.

ALEX: That’d be nice.

ALEX: Would you recommend Midsommar to anyone watching now?

CHASE BURTON: I would recommend it though at your own risk!

ALEX: Yeah, I agree. There’s a significant risk in watching the film. There is some very shocking imagery.


ALEX: That’s all I’m going to share.

CHASE BURTON: And thank you for joining us!

ALEX: Thank you too!


That is all for this week! Thank you for watching “The Daily Moth.”

I want to let you know that I’ll be out of the country next week because I will be going on vacation with my wife’s family in Canada. Our team will still be able to deliver news next week.

I will see you all later. Have a wonderful weekend. Stay with the light!


Supported by:

Convo []

Gallaudet University: []