Lumina Testifies Before the Florida House of Representatives
“How do we leverage the power of technology without sacrificing constitutional liberties? How do we ensure we are doing everything we can to keep our communities safe without turning our society into the Minority Report?”
In addition to Lumina’s Doug Licker and Jessica Dareneau, other panelists included Dr. Russell Baker, CEO & Founder of Psynetix, and Wayne A. Logan, a professor of law at Florida State University.
No Standardized Methodology
Beginning on the issue of using technology to keep communities safe, Psynetix’s Baker noted that the signs of violence or potential terrorism are often missed because there is no standardized methodology to collect, report and disseminate crucial information indicative of these potential acts – and that even if the data is available, it becomes siloed.
The problem lies in the massive amount of data available on the web – some 2.5 quintillion bytes are added to the internet daily – and in the constrained resources law enforcement agencies have to analyze that data and respond.
Real-time Detection of Digital Evidence
In the slide presentation, Lumina shared a quote from the RAND
Corporation which noted: “Most law-enforcement agencies in the United
States, particularly at the state and local level, don’t have a whole lot of
capability and technical people to manage and respond to digital evidence more
generally, much less real-time detection.”
That’s where technologies like Lumina’s Radiance platform
can be valuable for law enforcement.
“The power of our Radiance platform is two-fold – its
ability to ingest massive amounts of unstructured, open source data and its
real-time ability to analyze that information to predict and prevent
organizational risks and threats,” Dareneau said. “It does this through
purpose-built, best-in-class algorithms that can overcome the challenges of
massive unstructured data ingestion and prioritization.”
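The ingestion-and-prioritization workflow Dareneau describes can be sketched in miniature. This is not Lumina’s actual model – the risk terms, weights, and posts below are invented purely to illustrate scoring a stream of unstructured text and surfacing the highest-priority items first:

```python
import heapq

# Hypothetical keyword weights -- invented for illustration, not a real model.
RISK_TERMS = {"threat": 3.0, "attack": 5.0, "weapon": 4.0, "angry": 1.0}

def risk_score(text: str) -> float:
    """Sum the weights of any risk terms found in the text."""
    return sum(RISK_TERMS.get(w, 0.0) for w in text.lower().split())

def prioritize(posts: list[str], top_k: int = 3) -> list[tuple[float, str]]:
    """Return the top_k highest-scoring posts, highest score first."""
    scored = [(risk_score(p), p) for p in posts]
    return heapq.nlargest(top_k, scored, key=lambda t: t[0])

posts = [
    "lovely weather today",
    "he said he would attack the school with a weapon",
    "so angry about the game",
]
ranked = prioritize(posts, top_k=2)
```

In practice the hard part is the scoring model itself; the point here is only that a priority queue lets analysts see the most urgent items without reading everything.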
The question of our publicly available digital footprints, and of law enforcement’s ability to use that information in an investigation, was widely discussed at the hearing.
Is Privacy Dead?
“Digital dossiers exist today on us all, which law
enforcement can and will readily put to use in its work such as by means of
computers, patrol cars and even hand-held devices,” Logan testified. “And why
should law enforcement not be able to harness the crime control tools enabled
by technological advances, such as machine learning targeting massive data sets?”
“In my view, my personal view, they should be able to do so
but in a regulated manner,” he continued.
But, what should those regulations look like, and how best
to approach the balance between privacy and safety?
Logan noted that the European Union, California and Illinois are all taking steps towards data protection measures, and could be models for Florida to follow.
Transparency is Key
Dareneau said many of the policies being implemented relate to transparency.
“Transparency is so important, and that is what so many of
these other jurisdictions are enacting in their legislation – requirements that
you disclose what you are collecting and then how you are using it,” she said. “Our privacy policy and terms include exactly what we are collecting, how we are using it and who we could provide it to.”
As the hearing ended, Chairman Grant reiterated the work before his subcommittee to understand and delineate between private data and public information. “This body is committed to acting,” he said.
Committed to Acting
When the legislative session begins January 14, 2020, it’s clear that this topic will be a key focus for this subcommittee and the broader legislature.
As Logan noted, “Technology is really potentially a game changer here. The question is whether it will be permitted, what limitations are going to be put on it and what accountability measures will be put in place. It’s just a different era. We need to air the potential concerns here, and we need to transparently deliberate them and decide the issues.”
You can watch the hearing and review the materials here.
When the Centers for Medicare & Medicaid Services (CMS) announced its vision to modernize Medicare program integrity, Administrator Seema Verma highlighted the agency’s interest in seeking new innovative strategies involving machine learning and artificial intelligence.
Executive Order Directs HHS to use AI to Detect Fraud and Abuse
The announcement came earlier this month and followed an Executive Order by President Trump which urged the Secretary of Health and Human Services (HHS) to direct “public and private resources toward detecting and preventing fraud, waste, and abuse, including through the use of the latest technologies such as artificial intelligence.”
Medicare Fraud Estimated between $21 and $71 Billion Annually
Medicare fraud, waste, and abuse costs CMS and taxpayers an estimated $21 to $71 billion annually.
Artificial intelligence and machine learning could make detection more cost-effective and less burdensome, augmenting the predictive systems already designed to flag fraud.
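As a toy illustration of the kind of statistical flagging such predictive systems build on – this is not CMS’s actual method, and the provider names and amounts are invented – one can score each provider’s total billing against its peers and flag outliers:

```python
from statistics import mean, stdev

def flag_outliers(billing: dict[str, float], z_cut: float = 1.5) -> list[str]:
    """Flag providers whose billed total is an outlier versus peers (z-score).

    With only a handful of providers the sample stdev is inflated, so a
    modest threshold like 1.5 is used in this toy example.
    """
    amounts = list(billing.values())
    mu, sigma = mean(amounts), stdev(amounts)
    return [prov for prov, amt in billing.items()
            if sigma > 0 and (amt - mu) / sigma > z_cut]

claims = {
    "provider_a": 10_200.0,
    "provider_b": 9_800.0,
    "provider_c": 10_500.0,
    "provider_d": 97_000.0,   # bills roughly 10x its peers
    "provider_e": 10_100.0,
}
flagged = flag_outliers(claims)
```

Real program-integrity systems use far richer features (procedure codes, patient mix, referral patterns), but the principle – comparing each provider against a peer baseline – is the same.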
HHS Among Largest Data Producers in the World
In order to understand the potential for AI, CMS also recently issued a Request for Information asking, among other things, if AI tools are being used in the private sector to detect fraud and how AI can enhance program integrity efforts.
But the promise of AI isn’t just in the CMS data. It’s also in the behaviors of those looking to commit fraud.
According to Jeremy Clopton, director at accounting
consultancy Upstream Academy and an Association of Certified Fraud Examiners
faculty member, the risk of fraud is often described as having three key factors: a perceived pressure
or financial need, a perceived opportunity, and a rationalization of the fraud.
Tampa, FL, October 22, 2019 — Lumina, a predictive
analytics company whose AI-driven Radiance platform helps keep people and places
safe and secure through active and early detection of potential risk-related
behaviors, announced today that Robert E. Spring will join as Vice Chairman of
the Board of Managers effective immediately.
“Rob has been a critical contributor as part of our advisory
board, and we welcome him to his new role,” said Lumina CEO and Co-Founder, Allan
Martin. “Rob’s experience in the areas of national security and defense, as
well as corporate risk management, will add significantly to our long-term success.”
Spring is a managing director at Gracie Square Capital, LLC,
an investment and consulting firm, and has a long history of involvement with
the National Defense University, including its Center for the Study of Weapons
of Mass Destruction. He is a member of
the Advisory Board of RAND Corporation’s Center for Global Risk and Security
and the Board of the Jamestown Foundation. He has worked with the Defense Science Board
on issues involving the defense industrial base. Additionally, Spring has been involved in
efforts related to veteran suicide prevention and post-traumatic stress disorder.
“Lumina’s Radiance platform brings unparalleled
sophistication in helping organizations identify significant threats,” said
Spring. “I look forward to serving as Vice
Chairman, introducing this powerful technology to corporate and governmental institutions,
and helping Lumina fulfill its critical protection mission.”
Today’s announcement follows recent news of two capital
raises by Lumina totaling nearly $6.5 million, expansion of staff in its Tampa
offices and continued investment in sales and marketing efforts to key industry
verticals including education, government, finance and transportation.
“Rob has a keen understanding of national security and risk issues,”
said Chairman of the Board Andrew Krusen. “His insights will continue to inform
our go-to-market strategy, and we look forward to working with him in this new role.”
In addition to Krusen, the Lumina Board of Managers includes former Florida Attorney General and Secretary of State Jim Smith; Jeb Bush, Jr., Managing Partner at Jeb Bush & Associates; and Kathleen Shanahan, co-CEO of Turtle & Hughes and former Chief of Staff to Vice President-elect Dick Cheney and Florida Governor Jeb Bush. Co-Founders Allan Martin and Morten Middelfart also serve on the board. Governor Jeb Bush, former Homeland Security Secretary Michael Chertoff and Charles Allen, former Assistant Director of Central Intelligence for Collection, serve on the Lumina Advisory Board.
Lumina is a predictive analytics company founded on the idea
that technology is a force for good. The company’s optimized artificial
intelligence capabilities help keep people and places safe and secure through
active and early detection of high-risk behavior. Lumina’s Radiance
platform uses proprietary, deep web listening algorithms to uncover risk,
provide timely, actionable information, and help prevent catastrophic loss.
Lumina is committed to protecting what matters most, and its Radiance
platform is designed to help solve the world’s most challenging problems.
“Every online communication between
traffickers, ‘johns,’ and their victims reveals potentially actionable
information for anti-trafficking investigators.”
The study noted the potential for integrating human experts and computer-assisted technologies like AI to detect trafficking online.
AI and human trafficking
Similar research conducted at Carnegie Mellon University looked at how low-level traffickers and organized transnational criminal networks used web sites like Craigslist and Backpage to advertise their victims. The researchers developed AI-based tools to find patterns in the hundreds of millions of online ads and help recover victims and find the bad actors.
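One pattern-finding technique in this vein – sketched here as an illustration, not as the Carnegie Mellon researchers’ actual system – is linking ads that share a contact phone number, so that millions of seemingly separate ads collapse into a much smaller set of likely networks:

```python
import re
from collections import defaultdict

# Matches a simple NNN-NNN-NNNN phone pattern; real systems handle many formats.
PHONE = re.compile(r"\d{3}-\d{3}-\d{4}")

def link_ads(ads: dict[str, str]) -> list[set[str]]:
    """Group ad IDs whose text shares at least one phone number."""
    by_phone = defaultdict(set)
    for ad_id, text in ads.items():
        for phone in PHONE.findall(text):
            by_phone[phone].add(ad_id)
    # Union-find: ads sharing any number are merged transitively into one cluster.
    parent = {ad: ad for ad in ads}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for group in by_phone.values():
        ids = sorted(group)
        for other in ids[1:]:
            parent[find(other)] = find(ids[0])
    clusters = defaultdict(set)
    for ad in ads:
        clusters[find(ad)].add(ad)
    return list(clusters.values())

# Invented ads and numbers, for illustration only.
ads = {
    "ad1": "new in town, call 555-010-1234",
    "ad2": "available now 555-010-1234 or 555-010-9999",
    "ad3": "call 555-010-9999",
    "ad4": "unrelated ad, call 555-222-0000",
}
clusters = link_ads(ads)
```

Here ad1, ad2, and ad3 merge into a single cluster through their shared numbers, while ad4 stands alone – the kind of link analysis that helps investigators spot an organized network behind superficially unrelated postings.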
The conference brought together researchers, policy makers, social scientists, members of the tech community, and survivors.
One of those researchers – from Lehigh University – is working on a human trafficking project to help law enforcement overcome the challenges of turning vast amounts of data, primarily from police incident reports, into actionable intelligence to assist with their investigations.
Providing better alerts and real risks
Former Federal government officials share the optimism about the power of AI to aid law enforcement in weeding out the criminals and finding the victims.
“For example, law enforcement can look at young women of a certain age entering the
country from certain high-risk jurisdictions. Marry that up with social media
and young people missing from home, or people associated with a false
employment agency or who think they are getting a nanny job, and you start to
develop a complete picture. And the information can be brought up all at once,
rather than an analyst having to go through the Dark Web.”
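The data fusion described above can be sketched as a simple cross-reference. All field names, records, and thresholds below are invented for illustration; real systems fuse many more sources with fuzzier matching:

```python
# Invented toy records: border-entry data and missing-person reports.
entries = [
    {"name": "Ana R.", "age": 19, "origin": "high-risk-country"},
    {"name": "Maria L.", "age": 34, "origin": "low-risk-country"},
]
missing_reports = {"Ana R.", "Jo S."}

def cross_reference(entries, missing, max_age=25):
    """Flag young entrants who also appear in missing-person reports."""
    return [rec for rec in entries
            if rec["name"] in missing and rec["age"] < max_age]

hits = cross_reference(entries, missing_reports)
```

The payoff is exactly what the officials describe: the overlap across sources is surfaced at once, rather than an analyst assembling it record by record.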
To report suspected human trafficking to Federal law enforcement, call 1-866-347-2423.
To get help from the National Trafficking Hotline call 1-888-373-7888 or text HELP or INFO to BeFree (233733).
Learn more about AI-powered Radiance and its risk-sensing capabilities for issues like human trafficking.
The goal is to understand the underlying factors of suicide, cultivate active engagement with veterans, and increase the timely identification of risk and intervention for those in need.
Increasing Timely Identification and Intervention
A national research strategy is among the key components of PREVENTS. The Office of Science and Technology Policy (OSTP) is tasked with leading efforts to improve the coordination, monitoring, benchmarking, and execution of suicide-related data and research.
In its Request for Information, the OSTP announced its milestones and metrics would be focused on improving the ability to identify individual veterans and groups of veterans at greater risk of suicide and draw upon technology to capture and use health data from non-clinical settings to help target prevention and intervention strategies.
AI as an Early Detection System
Researchers believe that machine learning can be part of the solution when it comes to
early intervention and risk prediction, suggesting that AI can be an early
detection system by identifying and monitoring behaviors indicative of suicidal ideation.
One study they point to, conducted by researchers at the New York University School of Medicine and funded by a grant from the U.S. Army Medical Research and Acquisition Activity, used speech-based algorithms to help detect post-traumatic stress disorder (PTSD) in warzone-exposed veterans.
Speech-based Algorithms Help Identify PTSD
The study analyzed audio recordings of clinical interviews, extracting
40,526 speech features that were input into an algorithm and ultimately
narrowed down to 18 specific markers indicative of potential PTSD.
The algorithm correctly classified cases 89.1% of the time based on slower,
more monotonous speech characteristics, less change in tonality, and less
variation in activation.
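The general approach – winnowing tens of thousands of candidate features down to a few strong markers – can be sketched as follows. This is only an illustration of feature selection by class separation; the published study used its own algorithm and far more data, and every feature name and value below is invented:

```python
from statistics import mean

def top_markers(samples, labels, k=2):
    """Rank features by the gap between class means and keep the top k.

    samples: list of {feature_name: value} dicts; labels: 1 = PTSD, 0 = control.
    """
    pos = [s for s, y in zip(samples, labels) if y == 1]
    neg = [s for s, y in zip(samples, labels) if y == 0]
    gaps = {f: abs(mean(s[f] for s in pos) - mean(s[f] for s in neg))
            for f in samples[0]}
    return sorted(gaps, key=gaps.get, reverse=True)[:k]

# Invented toy data: PTSD-positive speakers talk slower with longer pauses.
samples = [
    {"speech_rate": 2.1, "tonality_var": 0.2, "pause_len": 1.9},
    {"speech_rate": 2.0, "tonality_var": 0.3, "pause_len": 2.1},
    {"speech_rate": 4.0, "tonality_var": 1.1, "pause_len": 0.5},
    {"speech_rate": 4.2, "tonality_var": 1.0, "pause_len": 0.6},
]
labels = [1, 1, 0, 0]
markers = top_markers(samples, labels, k=2)
```

Keeping only the strongest markers makes the resulting classifier both simpler and easier for clinicians to interpret.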
Deep Learning Neural Networks Predict Risk Based on Physicians’ Notes
A collaboration between the VA, the Department of Energy (DOE) and researchers at Lawrence Berkeley National Lab focused on building deep learning neural networks that could distinguish between patients at high risk and those who are not, based on physicians’ notes and discharge notes.
Among the challenges were the noisy data sets, which included
structured data such as lab work and procedures alongside unstructured data, like
handwritten notes. But, as one researcher on the project pointed out, the value
is in that unstructured data:
“We believe that, for suicide prevention, the unstructured data will give us another side of the story that is extremely important for predicting risk — things like what the person is feeling, social isolation, homelessness, lack of sleep, pain, and incarceration. This kind of data is more complicated and heterogeneous, and we plan to apply what we have learned …to help VA doctors better decide who is at high risk and who they need to reach out to.”
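One simple way to combine the two kinds of data, sketched here as an illustration (the cue terms and structured fields are invented, and the actual project used deep neural networks rather than hand-picked cues), is to turn risk-related phrases in the free-text notes into features alongside the structured fields:

```python
# Invented risk cues drawn from the kinds of signals the researcher mentions.
RISK_CUES = ["isolation", "homeless", "pain", "sleep"]

def featurize(structured: dict, note: str) -> list[float]:
    """Concatenate structured values with binary cue indicators from the note."""
    text = note.lower()
    cues = [1.0 if cue in text else 0.0 for cue in RISK_CUES]
    return [float(structured["num_visits"]),
            float(structured["num_meds"])] + cues

vec = featurize({"num_visits": 7, "num_meds": 3},
                "Patient reports social isolation and chronic pain.")
```

A downstream model then sees both sides of the story at once: the lab-and-procedure history and the signals buried in the narrative notes.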
The Path Ahead
The PREVENTS Task Force’s mandate is to submit a proposed
roadmap by next March. The possibilities presented by AI and
machine learning suggest that this technology should be a key area of focus,
with continued investment and research.
While early identification of suicidal behaviors and risk is just one piece of helping end this national tragedy, it is a critical component of the overall strategy – and AI can play an important role.
To contact the Veteran Crisis Line, callers can dial
1-800-273-8255 and select option 1 for a VA staffer. Veterans, troops, or their
family members can also text 838255 or visit VeteransCrisisLine.net for assistance.
The National Suicide Prevention Lifeline is 1-800-273-8255.