According to Spot, the idea isn’t to monetize the reports, and for now the startup is focusing on the user experience.
In the future, however, the idea is to sell some sort of AI-driven management system to human resources (HR) departments within companies.
That is the idea behind Hello Cass, a chatbot designed to support people affected by family and sexual violence.
It has been developed by Melbourne-based social enterprise Good Hood, which is currently seeking funding for a pilot of the bot early next year.
Spot then creates certified, private PDF reports from the interview; each time-stamped entry constitutes evidence, which could come in handy if the case escalates to a trial.
The data is stored in an encrypted space until the report is downloaded, at which point it is deleted from the servers.
Ms Koster said Hello Cass would help take pressure off frontline telephone support services.

"Currently we can't get enough people on the phones, so the wait times for people calling a crisis line can be up to 20 to 30 minutes."

Ms Koster said Hello Cass provided a way for people to ask about issues they may find difficult to talk about on the phone.
The 19-year-old tech marketer alleges she was fired from Ripcord in retaliation for reporting a fellow employee to human resources.
“It’s incredibly easy to forget things or misremember them,” added Shaw, “which is why it’s important to record them when the memories are fresh.” The concept for Spot came out of AI startup studio All Turtles, which was co-created by serial entrepreneur Phil Libin, cofounder of the note-taking app Evernote.
Daniel Nicolae and Dylan Marriott, both former Evernote employees, joined All Turtles last June to begin working on Spot.
Also involved are the Victorian CASA Forum, the peak body for Centres Against Sexual Assault, and No To Violence, which incorporates the Men's Referral Service.
Ms Koster said having Hello Cass available via SMS meant people could chat to her even if they did not have an internet-enabled smartphone.
“A bot can’t judge and can’t have unconscious bias,” said Shaw. If users have questions about whether they should report certain behaviors or actions, the bot goes through definitions of what constitutes harassment and explains the process involved in reporting such incidents.