Advancing Public Safety and Protecting Privacy

Lumina Testifies Before the Florida House of Representatives

“How do we leverage the power of technology without sacrificing constitutional liberties? How do we ensure we are doing everything we can to keep our communities safe without turning our society into the Minority Report?”

These were the opening questions posed by Florida State Representative James Grant at a recent hearing focused on Using Technology to Advance Public Safety and Privacy in the Florida House of Representatives.

Lumina joined a panel of expert witnesses to answer these and other questions from members of the Criminal Justice Subcommittee.  

In addition to Lumina’s Doug Licker and Jessica Dareneau, the panelists included Dr. Russell Baker, CEO and Founder of Psynetix, and Wayne A. Logan, a professor of law at Florida State University.

No Standardized Methodology

Beginning on the issue of using technology to keep communities safe, Psynetix’s Baker noted that the signs of violence or potential terrorism are often missed because there is no standardized methodology to collect, report and disseminate crucial information indicative of these potential acts – and that even if the data is available, it becomes siloed.

Lumina expanded on those complications, noting that 93 percent of those carrying out a mass violent attack make threatening communications prior to the event – including on social media – and that 75 percent of terrorists used the internet to plan an attack.

The Internet is Useful to Everyone…Including Bad Actors

“The internet, it turns out, is useful to everyone…and that includes bad actors,” Licker testified.

“The UN, the FBI and the Office of the Director of National Intelligence (ODNI) support the use of new technologies to help mine the publicly available information on the internet to help prevent, predict and deter attacks in the future,” he continued.

The problem comes in the sheer amount of data available on the web – some 2.5 quintillion bytes are added to the internet daily – and in the constrained resources law enforcement agencies have to analyze that data and respond.

Real-time Detection of Digital Evidence

In the slide presentation, Lumina shared a quote from the RAND Corporation which noted: “Most law-enforcement agencies in the United States, particularly at the state and local level, don’t have a whole lot of capability and technical people to manage and respond to digital evidence more generally, much less real-time detection.”

That’s where technologies like Lumina’s Radiance platform can be valuable for law enforcement.

“The power of our Radiance platform is two-fold – its ability to ingest massive amounts of unstructured, open source data and its real-time ability to analyze that information to predict and prevent organizational risks and threats,” Dareneau said. “It does this through purpose-built, best-in-class algorithms that can overcome the challenges of massive unstructured data ingestion and prioritization.”

The question of our publicly available digital footprints, and of law enforcement’s ability to use that information in an investigation, was widely discussed at the hearing.

Is Privacy Dead?

“Digital dossiers exist today on us all, which law enforcement can and will readily put to use in its work such as by means of computers, patrol cars and even hand-held devices,” Logan testified. “And why should law enforcement not be able to harness the crime control tools enabled by technological advances, such as machine learning targeting massive data sources?”

“In my view, my personal view, they should be able to do so but in a regulated manner,” he continued.

But what should those regulations look like, and how best to strike the balance between privacy and safety?

Logan noted that the European Union, California and Illinois are all taking steps towards data protection measures, and could be models for Florida to follow.  

Transparency is Key

Dareneau said many of the policies being implemented relate to transparency.

“Transparency is so important, and that is what so many of these other jurisdictions are enacting in their legislation – requirements that you disclose what you are collecting and then how you are using it,” she testified. “So we try to stay on top of that, and make sure our privacy policy and terms include exactly what we are collecting, how we are using it and who we could provide it to.”

As the hearing ended, Chairman Grant reiterated the work before his subcommittee to understand and delineate between private data and public information.  “This body is committed to acting,” he said.

Committed to Acting

When the legislative session begins on January 14, 2020, this topic will clearly be a key focus for the subcommittee and the broader legislature.

As Logan noted, “Technology is really potentially a game changer here. The question is whether it will be permitted, what limitations are going to be put on it and what accountability measures will be put in place. It’s just a different era.  We need to air the potential concerns here, and we need to transparently deliberate them and decide the issues.”

You can watch the hearing and review the materials here.

How to read this report

About Radiance

When you entered a name to try Radiance OS-INT, Radiance went across the entire internet and gathered all documents associated with that name. Based on the use case you chose (Reputational Risk, Education, Know Your Customer, or Security Clearance), Radiance filtered down the results through specific Behavioral Affinity Models (BAMs) to show you what is most relevant to your use case.

Our industry-leading technology uses AI and machine learning to scour publicly available, open-source web data. It correlates terms from the relevant Behavioral Affinity Models with the name being searched, and pulls all applicable content into a comprehensive report. The results are prioritized, making it easy for you to further analyze the findings and determine potential risk.
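The correlate-and-prioritize step described above can be pictured with a minimal sketch. This is an illustrative assumption of how such a filter might work, not Radiance's actual implementation; the data shapes, term lists, and ranking rule are all hypothetical.

```python
# Hypothetical sketch of matching a name against BAM terms and ranking results.
# Radiance's real models and scoring are proprietary; this only illustrates the idea.

def filter_and_rank(documents, name, bam_terms):
    """Keep documents that mention the name and at least one BAM term,
    ranked by how many distinct terms each document matches."""
    results = []
    for doc in documents:
        text = doc["text"].lower()
        if name.lower() not in text:
            continue  # the page never mentions the searched name
        hits = [t for t in bam_terms if t.lower() in text]
        if hits:
            results.append({"url": doc["url"], "terms": hits})
    # Pages matching more distinct terms are surfaced first.
    return sorted(results, key=lambda r: len(r["terms"]), reverse=True)
```

In this toy version, a page is only returned when the searched name and at least one affinity term co-occur on it, which mirrors the "needle in the haystack" behavior described below.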

Keep in mind, Radiance’s deep web search capabilities are designed to find the needle in the haystack. Its power is in finding all the web content that includes the name being searched and a term related to the corporate reputation bundle. The fact that Radiance flags a web page does not necessarily mean a risk is involved. The report is designed as a tool to help expedite risk analysis of open source data.


On the first page of your results, you will see the name that you ran through Radiance. Underneath the name are a few key statistics:

  • Total Links Processed: the total number of links across the internet associated with that name, or part of that name. Radiance does not make any determination that the name(s) found within the links processed belong to the same individual you submitted through Radiance.
  • Links Identified with Affinity: out of the total links associated with the name, the number of links that were relevant to the BAMs associated with your use case.
  • Flagged: the number of links highlighted in the system to be represented in this report. If the number of links flagged matches the number of links identified with affinity, then these are all of the results that Radiance returned.
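The three statistics form a funnel: every link processed, the subset matching a Behavioral Affinity Model, and the subset flagged for the report. A small sketch makes the relationships concrete; the field names follow the report, but the function itself is a hypothetical illustration, not part of Radiance.

```python
# Hypothetical sketch of the report's headline statistics.
# processed -> identified with affinity -> flagged is a narrowing funnel.

def report_statistics(links_processed, links_with_affinity, flagged_links):
    """Summarize the funnel shown at the top of a Radiance report."""
    return {
        "total_links_processed": len(links_processed),
        "links_identified_with_affinity": len(links_with_affinity),
        "flagged": len(flagged_links),
        # True when every affinity-relevant link made it into the report.
        "all_results_shown": len(flagged_links) == len(links_with_affinity),
    }
```

When `flagged` equals `links_identified_with_affinity`, the report contains every result Radiance returned, exactly as the Flagged bullet above describes.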

The profile relating to the name can be found under these statistics. Here, in blue, you will find the name of the Behavioral Affinity Model followed by the terms that came back associated with that affinity. By each term, you will see one or more hyperlinks. This is the actual content that you can view to determine if there is risk associated with the individual you submitted into the system.

Going through the Results

As mentioned above, results are organized by a term related to the specific Behavioral Affinity Model and a link to the actual content. To evaluate the page itself, you may want to see where the name appears on it. To do so, press Ctrl+F and type the name you submitted into the search bar that appears. Then use Ctrl+F again to search for the relevant term.

Evaluating Results

Consider in context the page you are looking at, the name you ran, the BAM term that brought the result back, and any other information you have on the individual in question. It is then up to you to determine whether the individual is the same person you were interested in researching and whether the information is relevant to your specific needs or use case.

Your ‘Bundle’ and Behavioral Affinity Models

At the end of the profile you will see your chosen use case. This use case is represented by a bundle of Behavioral Affinity Models representing the various types of risk associated with that use case. All of the BAMs will be listed with an explanation relating to each one.

Please remember, the information produced by Radiance is obtained from publicly available online sources and may contain obscene, offensive, or violent material.  Radiance identifies key words and phrases that could potentially be associated with various risks and safety concerns.  Radiance does not determine or make any assessment, opinion, or fact finding on whether the individual, terms, or entity you submitted is associated with such risk or safety concerns, as that is up to you to determine. 

This report is not a consumer report, as defined by the Fair Credit Reporting Act (FCRA) and you should not use this report as a background check for employment, tenant or credit purposes.  Radiance should be used for research and/or intelligence gathering purposes only.  You are responsible for complying with the FCRA should you use any information contained in this report for any purpose under the FCRA, and for complying with any other applicable laws.   

You must be over 13 years of age to try Radiance. In addition, you may not submit the name of an individual who is under the age of 13. By agreeing to our Terms of Service, you confirm that you are over 13 years of age and agree to comply with COPPA – the Children’s Online Privacy Protection Act, a federal law enacted in 1998 and enforced by the Federal Trade Commission. Click here for more information.