Lumina Testifies Before the Florida House of Representatives
“How do we leverage the power of technology without sacrificing constitutional liberties? How do we ensure we are doing everything we can to keep our communities safe without turning our society into the Minority Report?”
In addition to Lumina’s Doug Licker and Jessica Dareneau, other panelists included Dr. Russell Baker, CEO and Founder of Psynetix, and Wayne A. Logan, a professor of law at Florida State University.
No Standardized Methodology
Beginning on the issue of using technology to keep communities safe, Psynetix’s Baker noted that the signs of violence or potential terrorism are often missed because there is no standardized methodology to collect, report and disseminate crucial information indicative of these potential acts – and that even if the data is available, it becomes siloed.
The problem comes in the massive amount of data available on the web – some 2.5 quintillion bytes are added to the internet daily – and in the constrained resources law enforcement agencies have to analyze that data and respond.
Real-time Detection of Digital Evidence
In the slide presentation, Lumina shared a quote from the RAND Corporation, which noted: “Most law-enforcement agencies in the United States, particularly at the state and local level, don’t have a whole lot of capability and technical people to manage and respond to digital evidence more generally, much less real-time detection.”
That’s where technologies like Lumina’s Radiance platform
can be valuable for law enforcement.
“The power of our Radiance platform is two-fold – its
ability to ingest massive amounts of unstructured, open source data and its
real-time ability to analyze that information to predict and prevent
organizational risks and threats,” Dareneau said. “It does this through
purpose-built, best-in-class algorithms that can overcome the challenges of
massive unstructured data ingestion and prioritization.”
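Lumina’s actual algorithms are proprietary and not described in the hearing, but the general idea of ingesting unstructured text and surfacing the highest-priority items first can be sketched with a toy keyword-weighting example (the terms, weights, and sample posts below are invented purely for illustration):

```python
# Toy illustration of unstructured-data prioritization, NOT Lumina's
# actual algorithms: score free-text items by hypothetical severity
# weights and surface the highest-scoring items first.
import heapq

# Hypothetical severity weights for terms an analyst might flag.
WEIGHTS = {"attack": 5, "weapon": 4, "threat": 3, "angry": 1}

def score(text: str) -> int:
    """Sum the weights of any flagged terms appearing in the text."""
    words = text.lower().split()
    return sum(w for term, w in WEIGHTS.items() if term in words)

def prioritize(items, top_n=3):
    """Return the top_n highest-scoring items, most severe first."""
    return heapq.nlargest(top_n, items, key=score)

posts = [
    "lovely weather today",
    "so angry about everything",
    "planning an attack with a weapon",
]
print(prioritize(posts, top_n=2))
```

A production system would of course use trained models rather than a static keyword list, but the shape of the pipeline (ingest, score, rank) is the same.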
The question of each of our publicly available digital footprints, and law enforcement’s ability to use that information in an investigation, was widely discussed at the hearing.
Is Privacy Dead?
“Digital dossiers exist today on us all, which law
enforcement can and will readily put to use in its work such as by means of
computers, patrol cars and even hand-held devices,” Logan testified. “And why
should law enforcement not be able to harness the crime control tools enabled
by technological advances, such as machine learning targeting massive data sets?”
“In my view, my personal view, they should be able to do so
but in a regulated manner,” he continued.
But, what should those regulations look like, and how best
to approach the balance between privacy and safety?
Logan noted that the European Union, California and Illinois are all taking steps towards data protection measures, and could be models for Florida to follow.
Transparency is Key
Dareneau said many of the policies being implemented relate to transparency.
“Transparency is so important, and that is what so many of
these other jurisdictions are enacting in their legislation – requirements that
you disclose what you are collecting and then how you are using it,” she said. “Our privacy policy and terms include exactly what we are collecting, how we are using it and who we could provide it to.”
As the hearing ended, Chairman Grant reiterated the work before his subcommittee to understand and delineate between private data and public information. “This body is committed to acting,” he said.
Committed to Acting
When legislative session begins January 14, 2020, it’s clear that this topic will be a key focus for this subcommittee and the broader legislature.
As Logan noted, “Technology is really potentially a game changer here. The question is whether it will be permitted, what limitations are going to be put on it and what accountability measures will be put in place. It’s just a different era. We need to air the potential concerns here, and we need to transparently deliberate them and decide the issues.”
You can watch the hearing and review the materials here.
(S4 is a mobile app that allows people to report concerning behaviors in real time. It’s short for See Something, Say Something.)
The first alert: Tuesday afternoon
This alert was from a high school student.
The student expressed concern that a best friend was at risk of suicide.
It turns out that the two students had recently lost another
close friend to suicide. Since that
time, the friend at risk had been distant and negative, and showed other
warning signs, which you can read more about from the American
Foundation for Suicide Prevention.
The student who sent the S4 alert wanted to make sure the
best friend got help before it was too late.
Not surprisingly, the student wished to remain anonymous. But the student shared the school information and the name of the friend. Our S4 app also validated the location from which the alert was sent.
80% of those considering suicide give some sign of their intentions
This report was a serious concern. Statistics show that 80% of those
considering suicide give some sign of their intentions, and often those
signs are communicated to the people closest to them.
We acted immediately, calling the school and sharing the
information with the administration, who confirmed the recent suicide and
thanked us for the report.
A person in a time of crisis would get the help they needed.
The second alert: Wednesday late morning
Just 20 hours later, we received another S4 alert.
This one was different.
The person reporting the concern had innocently moved a postal package for a neighbor. Then, the person noticed that the package had a marking indicating that it was from a company that sells bulletproof armor.
What to do with this information? Buying armor isn’t illegal. But, why was this neighbor concerned? Was there something else to the report?
More than 75% of the attackers in mass violence events exhibit concerning behaviors
Through a search of all the open source data on the Internet, Radiance’s Open Source Intelligence (OS-INT) tool found publicly available social media images of the subject holding an IED, raising concerns that perhaps there was more to investigate.
We sent the S4 report, and the findings from OS-INT to the authorities, so they could determine appropriate next steps.
The third report: Wednesday afternoon
While we were working the bulletproof armor S4, another alert came in.
Again, it was a report concerning potential mass violence.
But this time, it was at a school.
The report indicated that a student had discussed bringing a gun to school the next day. The report included the student’s name, and the school that he attended.
93% of the attackers in mass violence events made threatening or concerning communications
The same Secret Service report we mentioned previously tells us that of the mass attacks in 2018, 93% of attackers made threatening or concerning communications prior to the attack.
That is why – like the bulletproof armor S4 report – we ran the student’s name through Radiance OS-INT.
Radiance quickly sent us a link to a publicly available
YouTube channel where a person with the same name as the student shows himself
executing a shooting rampage in a video game. We thought this was important
additional information to share with the local authorities.
Within minutes, we called the police, and learned they had received another tip surrounding the same student and were following up on the reports.
The power of S4
When we launched our S4 app, we knew the value it would bring to our clients as they work to keep their school and corporate campuses safe.
But we also understood the potential power it had for the broader community.
We knew we had to make the app available to others. And it had to be free of charge.
Since making our app available, we have had thousands of
downloads and hundreds of reports.
The truth is, we never like it when those reports come in. The thought that someone might want to do harm to themselves, or to others, keeps us up at night.
Be the light in your community
But, we know See Something, Say Something works. And we’re committed to using our technology
to help make a difference.
We encourage you to do the same.
Download the app today at the App Store or from Google Play, and help be the light in your community.
And, if you are in a suicide crisis, or know someone who is, the National Suicide Prevention Lifeline is 1-800-273-8255.
As the summer draws to a close and students return to campus, schools across the country are incorporating active shooter response training into their procedures and protocols. The drills are just one component of overall safety preparedness efforts being undertaken at the state, federal and local levels.
STRONG Ohio Plan Includes Social Media Scans
While response trainings on school campuses have become an increasingly common practice, the focus is even more pronounced in light of the recent mass shooting attacks in Dayton and El Paso.
In response to the shootings in Ohio, Governor Mike DeWine unveiled his STRONG Ohio plan, designed to reduce gun violence. The state created a School Safety Center, which will review school emergency management plans and offer risk, threat and safety assessments; consolidate school safety resources on saferschools.ohio.gov; promote the use of a tip line to anonymously report suspected threats; and scan social media and websites to identify people suggesting acts of violence.
Increased Arrests for Threatening Comments
Increased precautions aren’t just being taken at schools, and for good reason. Following those tragic events, the FBI ordered a new threat assessment to thwart future mass attacks in the country.
Be Prepared: Take notice of surroundings and identify potential emergency exits. Be aware of unusual behaviors and report suspicious activities to security or law enforcement.
Take Action: If an attack occurs, run to the nearest exit and conceal yourself while moving away from the dangerous activity. If you can’t exit to a secure area, protect yourself by seeking cover.
Assist and React: Call 9-1-1, remain alert and stay aware of the situation. Help with first aid when it is safe, and follow instructions once law enforcement arrives.
Part of your preparation can include downloading Lumina’s free See Something Say Something (S4) app. It’s a crowd-sourced mobile application that allows users to confidentially report concerns in real time.
You can learn more about S4 and download it here. It’s one part of our comprehensive, AI-driven risk management platform, Radiance.
The Contagion Effect Part 1: What is the impact of news coverage on suicidal behaviors?
Netflix’s recent decision to remove the controversial scene of a student committing suicide in its series, 13 Reasons Why, came nearly two years after mental health care professionals first approached the company raising concerns about what is called the “contagion effect.”
In other words, the potential for an increase in teen suicide, inspired by the show.
Although experts say that it is difficult to prove that news coverage or entertainment media caused a person to commit suicide, studies do show correlation.
Researchers have documented concerning trends related to several high-profile celebrity suicides. For example, after the death of Robin Williams in 2014, there was an almost 10 percent increase in deaths by suicide. Additionally,
after the suicides of Soundgarden’s lead singer Chris Cornell and the subsequent suicide of his friend Chester Bennington, the lead singer of Linkin Park, the National Suicide Prevention Lifeline (800-273-8255) received a 14 percent increase in calls.
The Contagion Effect Part 2: What can we learn from social media and the Internet?
Research suggests that Internet searches mirror real world suicide rates. A research letter published in JAMA Internal Medicine found there were one million (19 percent) more Internet searches about suicide after 13 Reasons Why aired. Searches included “how to commit suicide” (26 percent), “how to kill yourself” (9 percent), and “commit suicide” (18 percent). Some of the increased searches were for seeking help—including “suicide hotlines” (12 percent) and “suicide prevention” (23 percent). In fact, the University of Manchester’s National Confidential Inquiry into Suicide and Safety in mental health found that 25 percent of youth who died by suicide conducted a suicide-related Internet search shortly before their deaths.
In addition to the increased number of Internet searches, the research firm Fizziology found that in its first week of airing, 13 Reasons became the most tweeted about series in Netflix history with more than 3.5 million tweets. The show also generated a variety of social memes, including ones that mocked the character and made light of the experiences that led to her suicide. For students in crisis, the added exposure to negative social media is another difficult input to process.
Implications for Schools and Campuses
Increased Internet searches and social media engagement aren’t surprising results, considering the popularity of the series with teens. In fact, a study by the Pew Research Center found that 95 percent of teens have access to a smartphone and 45 percent say they are online ‘almost constantly.’
According to research published in the journals Crisis and The American Journal of Public Health,
real-time monitoring of online behavior can be a viable tool for assessing
suicide risk factors on a large scale. This research also concludes that social
media provides a channel that may allow others to intervene following an
expression of suicidal thoughts online.
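The research above suggests that real-time monitoring can surface expressions of suicidal thoughts in time for others to intervene. As a heavily simplified, hypothetical sketch of that idea (the phrase list and alert hook below are illustrative only; a real system would rely on clinically vetted criteria and far more sophisticated models):

```python
# Hypothetical sketch of real-time flagging: scan a stream of posts
# and call an alert hook when an at-risk phrase appears. The phrase
# list is invented for illustration, not a clinical instrument.
AT_RISK_PHRASES = ("want to die", "end it all", "goodbye forever")

def monitor(stream, alert):
    """Invoke `alert` for each post containing an at-risk phrase."""
    for author, text in stream:
        if any(p in text.lower() for p in AT_RISK_PHRASES):
            alert(author, text)

flagged = []
posts = [
    ("user1", "Great game last night!"),
    ("user2", "I just want to end it all"),
]
monitor(posts, lambda author, text: flagged.append(author))
print(flagged)  # authors surfaced for human follow-up
```

The key design point the research highlights is the human in the loop: the system does not intervene itself, it routes a concern to someone who can.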
So, what does all of this mean for schools and campuses charged with preventing youth suicide and keeping their students safe?
Recognizing the Warning Signs
In response to the show, the National Association of School Psychologists (NASP) has provided guidance for educators on how to engage in supportive conversations with students and provide resources to those in need. They encourage making parents, teachers, and students aware of suicide risk warning signs, taking warning signs seriously, and establishing a confidential reporting mechanism for students.
Recognizing that those warning signs may be displayed on students’ social channels, one challenge is the sheer amount of data school counselors would have to search through to identify concerning trends among their students. The 3.5 million tweets about 13 Reasons Why seem almost manageable in the context of the 2.5 quintillion bytes of data added to the web each day.
AI Technologies and Assessing Risk
Radiance provides two important tools to help keep campuses safe and to predict and prevent student suicide.
Open Source Intelligence (OS-INT) scours all open source data across
the entire Internet, looking for behavioral affinities related to suicide and
other threats to students. For suicide ideation alone, OS-INT performs more
than 7,000 searches of publicly available web data, and returns prioritized results
in five minutes. A manual search on a traditional web engine would take one person more than three weeks to complete. OS-INT is further amplified by Radiance’s Human Intelligence (HUM-INT), which is powered by the See
Something Say Something (S4) app. This app provides the confidential mechanism
recommended by NASP.
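The speed difference between automated and manual searching comes largely from fan-out: an automated system can issue many independent queries concurrently, while a person works through them one at a time. A minimal sketch of that idea, with `run_query` standing in for a real search backend (the query list and latency figure are invented for illustration):

```python
# Sketch of search fan-out: run many independent queries concurrently
# instead of serially. `run_query` simulates a real search backend.
from concurrent.futures import ThreadPoolExecutor
import time

def run_query(q: str) -> str:
    time.sleep(0.01)  # simulate ~10 ms of network latency per query
    return f"results for {q!r}"

queries = [f"term {i}" for i in range(100)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=25) as pool:
    results = list(pool.map(run_query, queries))
elapsed = time.perf_counter() - start

# Serially, 100 queries at ~10 ms each would take about 1 second;
# with 25 workers the wall-clock time is roughly 1/25th of that.
print(len(results), round(elapsed, 2))
```

Scaled up from 100 queries to thousands, with real network latencies measured in hundreds of milliseconds, this is how minutes of machine time can replace weeks of manual effort.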
Eight out of ten people considering suicide give
some sign of their intentions, and the S4 app allows students to confidentially
share concerns about their classmates in real time, providing school
administrators with early insights to support those in need.
Since 2011, the rate of mass shootings has more than tripled in the United States and schools have frequently been the target. While added physical security measures are a step towards preventing these attacks from happening on campuses, there is more we can do. Identifying a school shooter before their plans turn into action is the best method of prevention.
The FBI has found that for shooters under the age of 18, peers and teachers were more likely to observe concerning behavior than family members. To get more people to report concerning behavior, we have to know what to look out for. Here is what we know about school shooters from the data.
They experience high levels of stress.
A common characteristic found in high-risk individuals was the failure to navigate major stressors in their lives. One of the top-ranked stressors was mental health (not mental illness), present in 62% of those analyzed.
When it comes to providing students with resources for their mental health, most schools are underfunded. Of those analyzed, only 53% of schools reported they provided training on referral strategies for students with signs of mental health disorders. This presents a great area for improvement and could prove to be more effective than physical security measures.
There are other stressors that can be identified as well. Some of these included “conflict at school,” which was present in 22% of incidents, and conflict with friends or peers, which was present in 29% of shooters.
They display concerning behavior.
Most schools fail to observe concerning behaviors or develop intervention strategies if noticed.
Figure 9: James Silver, Andre Simons, and Sarah Craun, “A Study of the Pre-Attack Behaviors of Active Shooters in the United States Between 2000 and 2013.”
According to the most recent U.S. Department of Education Indicators of School Crime and Safety study, only 48% of schools reported providing training on the early warning signs of violent student behavior. Yet teachers still observed concerning behavior 75% of the time. Sadly, however, the most common responses were to communicate the concern directly to the shooter (83%) or to do nothing at all (54%).
These facts bring up many more questions than answers. How risk averse should administrators be when reporting concerning behavior? How do you report behavior without creating further grievances? Regardless of your unique approach for your institution and community, the time to be aware, alert, and prepared to act is before an attack, not just during and after.
They plan ahead.
The FBI’s study found that 77% of shooters will take a week or longer to plan their attack, and 46% spend a week or longer preparing. These activities can produce common identifiable behaviors and help raise the red flag.
Figure 6: James Silver, Andre Simons, and Sarah Craun, “A Study of the Pre-Attack Behaviors of Active Shooters in the United States Between 2000 and 2013.”
They leak information.
An information leak by the potential shooter is another commonly observed behavior. In fact, it was found that they leaked their intentions about 56% of the time.
A large majority of these information leaks take place in a student’s or employee’s digital life. These online leaks often go unnoticed or unheard by those who could potentially intervene. This brings to light the importance of taking what individuals say, even online, seriously.
Yet, only 6% of schools reported that staff resources were used to address cyberbullying. Shockingly, only 12% of schools reported that cyberbullying happened at least once a week at school or away, which reflects either underreporting, a lack of resources to monitor and address the issue, or that many schools are simply oblivious to the problem.
They target familiar places.
A known connection to the location of the attack is another factor that comes into play. In 73% of cases the shooter had a known connection to the location of attack. Almost all perpetrators under 18 (88%) targeted a school or a former school.
Figure 6: James Silver, Andre Simons, and Sarah Craun, “A Study of the Pre-Attack Behaviors of Active Shooters in the United States Between 2000 and 2013.”
Looking into this further, in 64% of cases at least one victim was specifically targeted by the shooter. In cases where a primary grievance could be identified, the most common was adverse interpersonal action against the shooter. This means that shooters commonly target individuals they have grievances with: students, teachers, or administrators.
Identifying a potential shooter before an incident can be the difference between life and death. With the use of predictive analytics, the potential to identify these patterns is more advanced than ever. School shootings and school security have been under-researched for decades. Lumina Analytics has been building and perfecting these exact technological tools to help keep schools safe.
It’s not always easy for young people to articulate their problems. A student who regularly attends class and receives good grades could also be fighting an addiction. A teen constantly smiling for Instagram photos could actually be depressed. For friends and family of the person struggling, recognizing the warning signs of distress might not come easily.
Artificial intelligence can act as a voice for people dealing with various internal issues. It can also notify loved ones or even officials when a person needs help. The following two stories serve as examples of potential tragedies that could be avoided thanks to artificial intelligence:
Using Artificial Intelligence to Fight Cyber Bullying
Hailey was in her dorm room staring at her phone. A stranger had posted another fake story about her. Hailey knew if she reported it, the imposter would just create a new account or use a website that allows anonymous posts.
Hailey is one of the more than 20% of college students who experience cyberbullying. She struggled with bullying and depression throughout her first two years of college before her friends and family were able to help her, but she could have gotten help much sooner: as soon as the menacing messages appeared, cutting-edge predictive analytics paired with human analysis could have begun combating the issue.
Catch Suicidal Tendencies Early with Artificial Intelligence
Ana had been a star student in high school. She held a part-time job, ran track and was in a serious relationship. During her freshman year of college, she became increasingly depressed. One night she texted heart emojis to all her friends, wrote a goodbye letter to her parents, and attempted suicide. Ana’s friends found her and called 911 in time.
While she was lucky, suicide has risen to become the second-leading cause of death among Ana’s age group. Ana, and so many others like her, could have benefited from help and treatment as soon as predictive analytics powered by artificial intelligence flagged her online searches and habits as possible suicidal tendencies.
As mental health problems become more common, and troubling behavior migrates online where it is harder to identify using traditional methods, many schools are struggling to adapt. To face these new challenges, innovative solutions are needed.
What if a sophisticated system could immediately alert student services to the problems their students face, as it could have for Hailey and Ana? The idea of counselors and health care professionals being guided to students’ darkest struggles is not some distant future. It’s possible today thanks to Lumina, a predictive analytics firm that uses artificial intelligence and open-source data to combat some of society’s most pressing issues. Powered by cutting-edge artificial intelligence and human analysis, Lumina’s newest solution can identify harmful behavior online and alert people who can help.
By working with schools, Lumina can help counselors, student services, and even security officers adapt to new digital landscapes related to bullying, mental health, drug misuse, and other challenges. With new threats emerging every day, taking full advantage of artificial intelligence will allow schools to meet these challenges head-on.