
UM Police Chief Admits to Using Facial Recognition in Recently Surfaced Zoom Recording; School Continues to Deny Its Use

This project is a three-part investigative series by a team of media students at the University of Miami that seeks to determine whether the University of Miami uses facial recognition and to explore the greater impact of this technology.

This project is best viewed in the following order: 

1. The Story (video) 

2. The Facts (article) 

3. The Conversation (podcast)


The Story

This video provides background information on the incidents leading up to this investigative project.


The Facts

Despite Ample Evidence, the University of Miami Continues to Deny the Use of Facial Recognition


Five student protestors at the University of Miami sat nervously in a meeting with Dean Ryan Holmes last year as he reprimanded them for attending a protest that was not approved by the school. While Holmes educated the students about the proper policy for holding a demonstration, the students, who were all wearing masks at the protest, had another question on their minds: How did the school identify us?

UM refused to answer. This left students wondering if the university had employed sophisticated facial recognition technologies. But the university’s police department, UMPD, denied having or using facial recognition software. 

It’s possible that facial recognition was not used to identify the student protestors. However, an extensive investigation uncovered a mountain of evidence that calls into question the integrity of various statements UM made denying the use of facial recognition on campus.

First, it’s important to establish what facial recognition software is. Facial recognition is an umbrella term that covers a variety of technologies. Software that can detect a face in an image and a technology that can determine the race of a person in a photo are both considered facial recognition. But, the kind of facial recognition software we are concerned with is a technology that compares an image or video of a person against a database of known people in an attempt to identify the individual. 
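The kind of matching described above can be sketched in a few lines of code. This is a simplified illustration, not any agency's or vendor's actual system: faces are assumed to already be encoded as numeric feature vectors (real systems derive these with neural networks), and the database entries, names, and threshold are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(probe, database, threshold=0.9):
    """Compare a probe face vector against every enrolled vector and
    return candidate identities above the threshold, ranked best-first."""
    scores = [(name, cosine_similarity(probe, vec)) for name, vec in database.items()]
    candidates = [(name, s) for name, s in scores if s >= threshold]
    return sorted(candidates, key=lambda t: t[1], reverse=True)

# Toy "database" of enrolled face vectors (made up for illustration).
database = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
    "person_c": [0.6, 0.5, 0.4],
}

probe = [0.91, 0.11, 0.29]  # vector extracted from a surveillance still
matches = identify(probe, database, threshold=0.8)  # ranked candidate list
```

Note that the output is a ranked list of candidates rather than a single definitive answer, which mirrors the "10 or 20 names" style of result described later in this article.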

Technology of this kind is widely used across the United States, including by law enforcement.

So, does the UMPD use facial recognition technology?

The evidence suggests the answer is yes. 

  • Dean Ryan Holmes said the university uses facial recognition software.

This admission was made to a group of students who were called into a meeting with the dean after attending the “Die-In” demonstration last year, according to meeting attendees Mars Fernandez and Esteban Wood. The protesters were demanding pandemic protections for subcontracted workers, students, staff, and faculty.

Holmes did not respond to three requests seeking comment for this project.

  • UMPD Police Chief David Rivero boasted about using facial recognition in several investigations. 

Rivero said, “the Florida Department of Law Enforcement (FDLE) has a facial recognition software that they use through all of the people that have been arrested in the state of Florida to compare it to our video or our photo of our bad guy, and we’ve identified a few thieves that way.” 

This statement was made a year ago, in a recorded Zoom meeting (15:20-15:40) with an introductory journalism class.

Rivero did not reply to our requests for comment. 

  • Rivero described using the Florida Department of Law Enforcement’s facial recognition technology, while denying it was facial recognition technology.

“Let’s say, our camera system catches somebody stealing your laptop, and we have a good facial of that person, I can submit that facial shot to FDLE and they’ll try to match it to somebody that’s been arrested and looks like that picture,” Rivero told Miami Hurricane reporter Naomi Feinstein in 2020. “Usually, they come up with either 10 or 20 names and they’ll submit us the names with the pictures and say, ‘look, our system was able to identify these 20 people that looked like your suspect’ … It is not really facial recognition; it is just comparing photos to photos … They have software that does that for them.”

Semantics aside, what Rivero is describing is considered facial recognition, by any definition. 

  • UM has access to the Face Analysis Comparison & Examination System, FACES, a facial recognition technology run by the Pinellas County Sheriff’s Office, according to two official documents. 

A 2015 training PowerPoint for FACES lists UMPD as an authorized FACES user. The PowerPoint was obtained by the Georgetown Center for Privacy and Technology.

Additionally, reporter Joey Roulette requested a list of all authorized FACES agencies from PCSO in 2019 for an article with the Orlando Sentinel. The University of Miami was on the list the sheriff's office returned to him. 

Authorized FACES agencies can conduct facial recognition searches across FR-Net, a database that houses 22 million Florida driver’s license and ID photos and 11 million law enforcement photos. 

Declined Interviews, Misleading Responses, and a Lack of Transparency 

UM has been less than forthcoming during this investigation. 

When The Miami Hurricane started looking into the possible use of facial recognition last year, Chief Rivero indicated UMPD does not use facial recognition but declined an interview or further comment.

However, Rivero’s CV mentioned facial recognition being part of a camera system he helped install. 

According to his CV, the camera system installed at UM features “video analytics which is the use of sophisticated algorithms applied to a video stream to detect predefined situations and parameters such as motion detection, facial recognition, object detection and much more.” 


After The Hurricane found his CV online, Rivero reached out to clarify that it was misleading.


“[The] camera system has all those capabilities, doesn’t mean we use them, but it has all those capabilities so it was kind of misleading … We don’t have facial recognition,” Rivero told the Hurricane. 


Rivero then backtracked on previous remarks about using the “Florida Department of Law Enforcement’s facial recognition technology.”


As stated above, Rivero told The Hurricane, “It is not really facial recognition; it is just comparing photos to photos … They have software that does that for them.”


Rivero refused further inquiries on the subject. 


When our media team started looking into facial recognition being used by the school this year, the university was similarly uncooperative.


Our investigative staff asked to interview Julian Carter, the night security manager for housing and residential life; Ramon Valdes, the director of safety and security at the medical campus; and Barbara Jones, the security manager administrator. The university did not approve these interview requests. While those particular staff members were willing to be interviewed, it is university policy that they are not allowed to talk to the media without prior approval. 


Additionally, we submitted several questions to the university for UMPD. They included:

“Is UM or UMPD able to identify students' locations through Wifi? How precise is this? Can UM or UMPD see where students swipe their student ID cards on campus? Has UM or UMPD used or have access to FACES, a technology used by the Coral Gables Police Department? Does UMPD have any internal regulation regarding what kind of investigation they would use this software for?”


The university ignored the questions and instead issued a statement from Jacqueline R. Menendez, the vice president of university communications, two months after our initial request. The statement failed to address the majority of the questions; it addressed only facial recognition, further denying that it is used.


Then, we asked the school for further clarification in light of Rivero’s admission of using facial recognition on a Zoom recording and the official document showing UMPD as an authorized FACES agency. 

This was the reply from Menendez:

“The University of Miami and the University of Miami Police Department (UMPD) do not utilize facial recognition technology. 

“On occasion, when UMPD is attempting to identify a suspect in a criminal case, it will provide a photograph of the suspect to either the Florida Department of Law Enforcement (FDLE) or the Pinellas County Sheriff’s Office (PCSO). That photograph is compared with a database of photographs of persons who have been arrested for criminal offenses and reside in that database, which is managed by and only accessible to, those agencies. This process is conducted by FDLE and PCSO and is independent of UMPD. The cross-referencing of photographs is similar to the situation when victims of a crime view police photos to potentially identify a suspect.”

The university appears to be using semantics to disguise the fact that UMPD is using facial recognition.

UM says the photo comparison process is independent of UMPD; therefore, UMPD does not utilize facial recognition—even though UMPD is sending images to agencies that do use facial recognition. PCSO has one of the largest databases in the country. 

Another part of the statement is simply false.

UM claims the database of photos is only accessible to PCSO and FDLE. However, this database, FR-Net, is accessible to any of the 275 authorized FACES agencies, including UMPD, according to PCSO’s own documents.

Finally, the last sentence of the statement is misleading.

The university tries to compare the photo analysis process to “the situation when victims of a crime view police photos to potentially identify a suspect.” This suggests the process is done by an individual person and not facial recognition software. 

Why Does it Matter?

The use of facial recognition technology by law enforcement has been an increasingly controversial topic, especially given its lack of federal regulation. 

Currently, PCSO’s only regulation for FACES is that the software be used, whenever reasonable, for official investigations only.


Audits are not conducted to assure this rule is followed.

Some arguments against the use of facial recognition cite privacy concerns, how the technology is used, and the inaccuracy of the algorithms, particularly when identifying women and people of color.

To hear more about the greater impact of facial recognition, head over to our podcast, found below this article.


The Conversation: Transcribed

From privacy concerns and racial biases in the software to a lack of regulation, there are many reasons people are apprehensive about the use of facial recognition software. This podcast takes a deeper look at those issues and explores the lesser-known benefits of facial recognition now and in the future.

Host: Jessie Lauck   00:01

Hello and welcome to the third and final part of this investigative special project. This podcast will be a conversation about the benefits and negative consequences of facial recognition. 


First, let's hear what students at UM think.


Interviewee: Julia McLeavy  00:14

So, I'm in favor of some facial recognition technology. I know they use it in medicine a lot to try to figure out different chromosome deficiencies and see if they actually have it or not just based on the face.  Facial recognition should definitely not be used on college campuses, especially in terms of people not knowing that it's going on and just walking around.

Interviewee: Addie Spain  00:36

I think in some cases of extreme crime. If there's a serial killer on the loose, yes, please, tap into my ring doorbell camera to get his face if you walked by my house. But, I feel like there is a high chance that would be abused. It'd be used a lot more leniently than it should be.

Interviewee: Stella Bordon  00:57

You don't know if the authority that's using it is abusing their power. There's also a lot of racial bias. But, without that it ideally could be good. It could be good for a lot of things like sex trafficking, like victims, missing people, finding them, it could be helpful for safety stuff. 


Host: Jessie Lauck  01:20

Additionally, we polled 165 people, 85% of whom are current college students. Reporter Jesse Lieberman is here to tell us the results of that poll.


Reporter: Jesse Lieberman  01:29

Thanks Jessie. So, we found that: 

  • 43% of people were okay with facial recognition being used on college campuses, but only to identify criminals. 

  • 20% of people believe facial recognition should not be used on college campuses for any reason.

Regarding privacy:

  • 94% said facial recognition could be an invasion of privacy. 

  • 13% of respondents believe that the benefits outweigh the privacy concerns

  • 27% believe the privacy concerns are greater than the possible benefits. 

So, obviously, we have lots of different opinions.

Host: Jessie Lauck  02:04

I can see that.

Reporter: Jesse Lieberman  02:06

Yeah, but overwhelmingly, 95% of people believe that technology should be at least regulated.

Host: Jessie Lauck  02:12

Interesting. Thank you, Jesse. 

Reporter Clare O'Connor has spent the last few months taking a closer look at the possible negative consequences of facial recognition technology. She's interviewed a cybersecurity expert, a First Amendment lawyer, and anti-facial recognition organizers.

Reporter: Clare O'Connor  02:28

That's right, Jessie. Like students, the biggest concern for many of the experts and organizers I talked to was the lack of regulation. There are currently no federal regulations of the use of facial recognition technology and few state and local laws. The people and experts I spoke to said that this often leaves regulations up to the organizations themselves that use the technology, which makes it easy to abuse and even easier to get away with.

Dave Maass is the Director of Investigations at the Electronic Frontier Foundation, the leading nonprofit that defends digital privacy, free speech and innovation. Here's what he had to say.

Interviewee: Dave Maass 03:06

I mean, facial recognition has gone largely under the radar. There's not a lot of transparency about how it's used, what the results are, how many mistakes it makes, what the accuracy rate is, what are the use cases for it, who's allowed to access it. There's this whole range of questions that are just unanswered in general.

Reporter: Clare O'Connor  03:24

Dave also leads the Atlas of Surveillance project, a database that houses surveillance technologies from law enforcement all over the country to try and bring more light to what technology is being used. He believes that this lack of transparency directly challenges our privacy rights.

Interviewee: Dave Maass  03:41

You know, the fact that most law enforcement in Florida believes that your images belong to them and you don't have a right to privacy over them should be concerning.

Reporter: Clare O'Connor  03:53

According to the Georgetown Law Center on Privacy and Technology, nearly 50% of American adults are in a law enforcement facial recognition database.

Host: Jessie Lauck  04:03

Wow, that's a ton of people in a database who might not even know they're being used for facial recognition purposes.


Reporter: Clare O'Connor  04:10

Yeah, and it's not just people who have broken the law in some way. In Florida, over 22 million driver's licenses and ID photos are part of the FR-Net database for facial recognition. Many other states use these kinds of photos as well.

Host: Jessie Lauck  04:24

That's interesting because, according to our survey, 39% of people said that they would feel that their privacy was being invaded if they were in a facial recognition database. Plus, an additional 42% said that it would make them uncomfortable.

Reporter: Clare O'Connor  04:37

Yeah, and that's the problem. Many of those people don't even know. If you have a Florida driver's license, you are in a facial recognition database.  But, it's not just law enforcement agencies using people's images without their consent. Here's one incident Dave brought to my attention.

Interviewee: Dave Maass  04:54

So, the University of Colorado, Colorado Springs - their police department wasn't using face recognition as far as I know, but there were researchers, university researchers, who were developing facial recognition systems. They set up secret cameras on campus to take pictures of students to add to their database to train their algorithm. I think that most students would find that creepy. Regardless of whether the camera was there for face recognition or not, I think most students would find secret cameras collecting them, even if they're walking in public on campus, would find that creepy and totally unacceptable.


Host: Jessie Lauck  05:31

Yeah, I don't know how I would feel if that was happening at my school.


Reporter: Clare O'Connor  05:35

It does seem a bit creepy. And, it begs the question: how much of our privacy are we willing to give up for our security? I would at least like to know if my images were being collected.


Host: Jessie Lauck  05:45

Agreed. Now, Clare, besides privacy concerns, did you find any other problems people might have with the use of facial recognition?

Reporter: Clare O'Connor  05:51

Yeah, another big concern I ran into was what facial recognition was being used for. This is what cybersecurity expert and professor at UM, Lokesh Ramamoorthi, had to say.


Interviewee: Lokesh Ramamoorthi  06:04

I would say it's like a weapon. You can use it for good purposes and you can use it for bad purposes. The problem is you don't know where that information is going. It's a very thin line, and it's diminishing further and further.


Reporter: Clare O'Connor  06:18

FACES, for example, is used by 17 agencies of the federal government, including ICE and the FBI. There is little regulation around how FACES is used. FACES is run out of the Pinellas County Sheriff's Office, and their only regulation for their officers is to use the technology for official investigations only. But, their regulation does not include conducting audits to assure this rule is being followed. According to a 2019 article from the Orlando Sentinel, the Pinellas County Sheriff's Office has never conducted an audit and agents aren't required to log a reason for a search. This lack of oversight and regulation can make the system easy to misuse.

Host: Jessie Lauck  06:59

Absolutely, I can definitely see how that could happen and no one would find out about it.  

At least here at the U, we know that our school has access to FACES, but a big problem was that our school refused to tell us what internal regulation they had surrounding its use. So, even though we don't have evidence it was being used outside of identifying a criminal, we can't say for sure that it wasn't used in any other way. We can't guarantee it won't be used in that way in the future.

Reporter: Clare O'Connor  07:24

Absolutely. According to Dave, the use of facial recognition in schools is actually a different concern than its use in cities for everyday residents.

Interviewee: Dave Maass  07:35

Students really have a right to know of how information is being collected on them. I think that in cities and counties you usually have the ability to either vote for the sheriff or not vote for the sheriff. That's usually an elected position at the city level. If you don't like what the police is doing, you can go talk to your city council member. And if your city council member doesn't agree with you, you can vote against them at an election. But, when it comes to a college campus, students don't have the power to vote in or out the University of Miami president,  but students are nevertheless paying tuition that is funding these systems. There should be some sort of representation. There should be some sort of accountability to students. Also, students should have the ability to have this information before they decide to go to a school or not.


Host: Jessie Lauck  08:27

That actually makes a lot of sense. 

According to our survey, 97% of respondents said that they had the right to know if facial recognition is being used in their cities or campuses and nearly 99% said students, faculty, and staff have a right to know what general surveillance techniques are being used on their campus.


Reporter: Clare O'Connor  08:45

That doesn't surprise me, but it is unfortunate that our school told us in an email that it would be inappropriate to disclose basic investigative techniques used on campus.  

But, there is one final point I want to bring up about the use of facial recognition and that is its effectiveness. It can become detrimental when the datasets that are used to train the facial recognition algorithms have historically been composed of images of faces not fully representative of diverse populations. In a 2018 project called Gender Shades, the MIT Media Lab tested facial recognition algorithms for their accuracy in identifying gender and skin tones. Subjects were split into four categories: black men and black women and white men and white women. All three algorithms performed the worst on black women with error rates up to 34% higher than for light skinned males. The algorithms failed on one in three women of color. One part of the project that stood out to me was, as they tested women with darker and darker skin tones, the odds that their gender would be correctly identified came down to a coin toss. This highlights a larger problem that there needs to be more research on how to improve accuracy rates.

Host: Jessie Lauck  10:09

That is certainly concerning. 

One thing I found that's even scarier is that criminal defendants might not be able to challenge this very technology, which could be flawed and is being used to identify them. According to a 2019 article written by the ACLU, two undercover cops purchased drugs from a black man in Jacksonville, FL in 2015. Instead of arresting the man on the spot, one officer held a phone to his ear, pretending to be on a call, while he snapped photos of the man selling them drugs. Later, the officers were unable to identify the man, so they sent the photo to a crime analyst who used a statewide face recognition system to compare the low quality photo against mug shots in the county's database. The program came back with several possible matches, the first of which was Willie Lynch. They arrested Lynch and charged him with the drug sale even though the algorithm only expressed a one-star confidence that Lynch was the correct person. But, the officers believed Lynch to be the man that sold them the drugs. During the trial, Lynch asked for the other booking photos the program returned as possible matches, and the court refused his request. This decision was later upheld by a Florida appeals court. When it was taken all the way up to the Florida Supreme Court, they declined to hear it based on lack of jurisdiction. This case demonstrates a huge problem: when a possibly unreliable technology is used to initially identify someone, they can't even challenge that technology in court by viewing other matches it produced.

Reporter: Clare O'Connor  11:41

Yeah. I mean, this example further highlights that we need more accountability and transparency with facial recognition, the developers behind the software, and the companies using it.

Host: Jessie Lauck  11:51

Definitely. Clare, thank you for providing us with all this information. It was very valuable.  

So, we know that the topic of facial recognition has a lot of serious concerns surrounding it, but I wanted to further explore what the benefits are. Reporter Maya Broadwater has spent some time researching just that. Maya, when many consider facial recognition an invasion of privacy, what can you tell me about the benefits of this technology?


Reporter: Maya Broadwater  12:15

Well, Jessie, unfortunately, UMPD, the Pinellas County Sheriff's Office, the Coral Gables Police Department and Miami Dade police all declined to comment for this project.  But, luckily, I was able to research and find many benefits to using facial recognition. 

First of all, it's already being used in so many ways in our everyday lives: unlocking your phone, organizing photos in your albums based on who's in them and, of course, it's being used as a security measure. So, more and more, we're seeing facial recognition being used at security checkpoints, in airports, at train stations, etc. But, since there is no contact required for facial recognition, like there is with, maybe, fingerprinting or other security measures, facial recognition is offering a quick and automatic verification experience, which is more important now in the age of COVID than ever before. And, it's looking like in the future, you're going to be able to check out at stores with just your face. So, instead of using a credit or a debit card, your face will be scanned meaning preventing fraud will be easier than ever.


Host: Jessie Lauck  13:20

Now, I understand that facial recognition is already being used in our day to day personal lives, but can you talk to me a little bit more about how this tech can be beneficial when it's used by law enforcement?

Reporter: Maya Broadwater  13:30

Yeah, of course. So, facial recognition has really changed the game for law enforcement agencies. In the past, police had to do manual searches, such as looking through mug shots or asking the public to help out, which can obviously be very time consuming and expensive. Now, police can use facial recognition technology to investigate low dollar crimes like car break-ins. They can also track missing persons, identify wanted criminals, and can also use it for early threat detection.


Host: Jessie Lauck  14:02

What do you mean by early threat detection?


Reporter: Maya Broadwater  14:04

For example, if police received a credible threat of a person planning something, maybe like a shooting or a bombing at a venue, they can use real time facial recognition to quickly monitor the crowd and receive an alert if the person of interest arrives.

Host: Jessie Lauck  14:19

Okay, so super interesting. Now, are there any specific cases that come to mind where facial recognition was used successfully?

Reporter: Maya Broadwater  14:25

Sure. So something really cool is that the largest facial recognition system in the entire country is based right here in Florida. Between 2014 and 2020, the system had over 400 successful outcomes. One of those that really stood out to me was from back in 2017. After a high speed chase, police were finally able to shoot out the tires of a stolen car and bring it to a halt, but when they went to arrest the driver he had passed out because he had stuffed some kind of drug in his mouth. He also had no identification cards on his body. Weirdest of all, he had chewed his fingerprints off. But, luckily investigators were able to turn to the facial recognition system, and they were able to find him through that database.

Host: Jessie Lauck  15:15

Wow. So that's really amazing and a very odd story, but, luckily, it sounds like this technology can be very helpful in catching criminals.

Reporter: Maya Broadwater  15:23

Definitely. And, they can be used to help the victims as well. So, one police department in Fort Worth, Texas is extending their use of their facial recognition software to search for missing children online that might be the victims of sex trafficking. In the future, law enforcement agencies around the country are trying to develop opportunities to use facial recognition to help respond, in real time, to AMBER alerts for missing children.

Host: Jessie Lauck  15:48

Now, that's amazing. While I'm very excited to hear about the benefits of this technology, how would you respond to the argument against its effectiveness, saying that there are racial and gender biases?


Reporter: Maya Broadwater  15:59

Well, that argument is definitely something that we all need to take very seriously. But, at the same time, many of their critiques about racial bias and facial recognition algorithms refer to much older technologies. So, newer versions of these facial recognition systems perform more accurately, and the technology continues to improve over time.  

Also, according to the Information Technology and Innovation Foundation, it's really important to consider the confidence thresholds of any studies that are showing racial bias. So, for example, an ACLU study compared photos of members of Congress to a mugshot database and found a number of false positives. However, the ACLU used a lower confidence threshold than recommended. They used an 80% confidence threshold for determining matches, which is okay for certain situations, like maybe when you're tagging people on social media, but it's definitely not okay for law enforcement. It's recommended to use a 99% threshold for law enforcement purposes, which brought back zero false matches.  
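The threshold effect Maya describes can be made concrete with a toy calculation. The similarity scores below are invented for illustration, not drawn from the ACLU or NIST studies: given scores the matcher might assign to database photos for one probe image, raising the decision threshold from 0.80 to 0.99 filters out the spurious near-matches.

```python
# Hypothetical similarity scores a matcher assigned to database photos
# for one probe image. Only "true_match" is actually the same person.
scores = {
    "true_match": 0.995,
    "lookalike_1": 0.91,   # different person with a similar face
    "lookalike_2": 0.84,   # another false positive at low thresholds
    "stranger": 0.40,
}

def matches_at(threshold, scores):
    """Return the names the system would report at a given confidence threshold."""
    return [name for name, s in scores.items() if s >= threshold]

loose = matches_at(0.80, scores)   # 80%: true match plus two false positives
strict = matches_at(0.99, scores)  # 99%: only the true match survives
```

The design point is that the threshold is a policy decision, not a property of the algorithm: the same scores yield very different "match" lists depending on where the line is drawn.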

Finally, the most recent results of the one to one facial recognition matching test from the National Institute of Standards and Technology shows the top performing algorithm having a lower error rate on black females than on white males.  

We also know that humans have a higher error rate in recognizing faces outside of their own race than faces with the same race as them. So, hopefully, as this technology evolves, we can further eliminate that bias.

Host: Jessie Lauck  17:27

That's super interesting. I actually never thought about it that way. I'm glad you're shedding light on the possible shortcomings of these different studies that I've definitely read before.


Reporter: Maya Broadwater  17:35

Yeah, definitely. I just really want to reiterate that the benefits of facial recognition technology don't stop at the police station. Facial recognition technology can also help in identifying deceased persons or people who might not be able to identify themselves, such as an individual who shows up unconscious to the hospital. This would kind of streamline the identification processes for hospitals, so they can alert family members sooner and then they can also devote more of their efforts to more pressing hospital issues.


Host: Jessie Lauck  18:04

Definitely. I can see how that would be greatly beneficial. Maya, thank you again for highlighting the solutions.


Reporter: Maya Broadwater  18:10

Yeah, of course. Thank you, Jessie.


Host: Jessie Lauck  18:12

So, whether in favor of facial recognition or against it, the tech is here to stay. Now, Clare and Maya, what are your thoughts on the best way to increase the benefits of facial recognition while mitigating the negative consequences?


Reporter: Clare O'Connor  18:24

Definitely regulation and transparency on its use.


Reporter: Maya Broadwater  18:28

I totally agree. I think the concerns over facial recognition are valid, and the best way to deal with them is to be very clear about how and why it's used and also to set up policies to prevent its misuse.

Host: Jessie Lauck  18:40

Which is actually why we did this project in the first place. We wanted to shed light on how facial recognition is used, particularly at our university.

Reporter: Clare O'Connor  18:47

Yeah, and unfortunately, the school wasn't the most helpful in this process. They even went as far as denying they use facial recognition by saying they outsource the process of identifying someone from a photo to the Pinellas County Sheriff's Office and Florida Department of Law Enforcement, two agencies we know use facial recognition software.

Reporter: Maya Broadwater  19:07

And, on top of that, they even compare the process to when a victim identifies someone in a police lineup, which of course is a totally different process. It seems like they were trying to lead us to believe that the Pinellas County Sheriff's Office and the Florida Department of Law Enforcement do the photo comparison process by hand which of course we know would take far too long.

Host: Jessie Lauck  19:27

That, and the school's refusal to provide us with further information, along with seemingly misleading information, is, at least in my opinion, kind of dangerous. I think that, like Dave said, students have the right to know what's going on at their universities so they can make informed choices on where to go to school and how to spend their tuition money.

Thank you to everyone who has taken a look at our project. I hope you learn something new about facial recognition and can be more informed going forward. 

The students that contributed to this project were myself, Jessie Lauck, Clare O'Connor, Maya Broadwater, Jesse Lieberman, and Patrick McCaslin. I want to especially thank Poynter for their contributions to this project, such as providing training and funding.
