BEE-L Archives

Informed Discussion of Beekeeping Issues and Bee Biology

BEE-L@COMMUNITY.LSOFT.COM

From: Jerry Bromenshenk <[log in to unmask]>
Date: Fri, 24 May 2019 15:54:32 +0000

Peter
Our app is fundamentally different from the others, most of which started up after we presented our first prototypes at an international workshop in Vermont.
The songs of bees are far more complex than the simple frequency and amplitude shifts that an experienced beekeeper or research scientist can discern, e.g., queen loss (most experienced beekeepers), swarming (Eddie Woods, U.K.), varroa mites (Jim Bach, WA), and Africanized versus European honey bees (Howard Kerr, TN).
Several of the scale/monitoring hives add a microphone and display some form of sonogram, and some competing smartphone apps have charts and gauges (one has a retro, steampunk look). The majority of these rely on the beekeeper to interpret the output. One of the smartphone apps has regional distributors who hold workshops (for a fee) to instruct users.
Eddie patented his device, a suitcase-style system with microphones, and sold it in the U.K. Howard did the same: his was a little hand-held device with a clear capsule into which one put a captured bee; the bee flew toward the sun inside the capsule, and little lights in the handle lit up green, yellow, or red. I describe these, and our own 10 years of R&D, in an online, peer-reviewed paper (https://www.mdpi.com/2079-6374/5/4/678).
Eddie set the pattern for most of the European approaches, and after our presentation in 2012, significant investments were made in funding acoustic research in Europe. All of this has resulted in others realizing that there's more to honey bee sounds than we knew. Remember, it's only recently that it was discovered that bees have vibration sensors (whether one chooses to call that sound detection is arguable).
You know that Tom Seeley has continued to identify specific bee-to-bee sounds.
We differ from others in some significant ways. We've been working on bee sounds since 2005. We were nominated for a national innovation award by the US military for our work using bee sounds as a rapid alert system for toxic chemicals. We were able to detect exposure to minute quantities of a variety of chemicals in seconds, and we were able to distinguish the type of chemical by the specific sounds emitted from exposed colonies (see results in our Biosensors paper). We have shown that we can tie distinct sounds to a variety of chemical and biological variables, first based on statistical analysis of sonograms, and later using AI-powered analyses. We hold patents for our system in the U.S. and Canada.
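To make that concrete for the technically inclined, here is a minimal, purely illustrative sketch (in Python) of that kind of pipeline: a sonogram (spectrogram) is reduced to a few band-power features and handed to an off-the-shelf classifier. The band edges, window sizes, labels, and choice of classifier are assumptions for the example, not our patented system.

    import numpy as np
    from scipy.signal import spectrogram
    from sklearn.ensemble import RandomForestClassifier

    def band_features(audio, fs=16000,
                      bands=((50, 200), (200, 400), (400, 800), (800, 2000))):
        # Reduce a 1-D hive recording to mean log power in a few frequency bands.
        f, t, Sxx = spectrogram(audio, fs=fs, nperseg=2048, noverlap=1024)
        feats = [Sxx[(f >= lo) & (f < hi)].mean() for lo, hi in bands]
        return np.log1p(np.array(feats))

    # Hypothetical labeled recordings (30 s each); labels would come from
    # visual inspections.  Random noise stands in for real hive audio here.
    X = np.stack([band_features(np.random.randn(16000 * 30)) for _ in range(20)])
    y = np.array(["healthy"] * 10 + ["queenless"] * 10)

    clf = RandomForestClassifier(n_estimators=100).fit(X, y)
    print(clf.predict(X[:3]))   # new recordings would be classified the same way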
Comparing our AI-powered analysis, which can be 'trained' and tuned for performance, to a beekeeper's or scientist's ear, or to their ability to look at some form of sonogram and reliably interpret it, is like comparing a Model T to a Tesla.
We approach bee sounds in a manner nearly identical to the speech recognition of Alexa, Siri, Cortana, or Nuance (Dragon NaturallySpeaking).

James Baker laid out the approach in 1975. He first released his software in 1982. Many of us remember it as somewhat useful. Over the years, it got better and better. He and his wife fundamentally changed how speech recognition is accomplished. The newest versions can be fine-tuned to regional accents. Scott Debnam, who has been with my team for a long time, is from the South. A couple of years ago, he tried it for field notes. The general version was OK; the one tuned for a Southern (Georgia) accent got it right.
Dragon (Nuance) changed how speech recognition was done. I remember labs full of students at UM trying to dissect sonograms to get at the essence of speech, with little success. AI approaches like Dragon's changed the whole approach.
Similarly, we started with statistical analyses of sonograms, but that was cumbersome. Robert Seccomb, another of my partners, pioneered the use of AI for analysis of bee behaviors (his M.S. thesis). He built the AI under our app.
Like Dragon, our AI-powered app can learn. Like Dragon, the first iterations require lots of input, processed by high-speed computers, even ensembles of computers, to TUNE the app. That's what our Kickstarter is all about. We expect it to be so-so initially, but as our citizen-science backers/users upload their recordings, app analyses, and inspection reports, we will re-train the app and issue automated updates. We saw progressive improvement on our benchtop and initial hand-held systems. The progression of improved performance and accuracy mirrors the path of improvement Dragon saw as it gained more and more users. And like Dragon in its early stages, we have to do the re-training from our Cloud data and update the app. With a large enough user group, and hopefully a much shorter time span than Dragon's 1982-2019 maturation, we will have a Dragon equivalent for bees in a couple of years.
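For the curious, here is a rough, hypothetical sketch of what that centralized re-training step could look like; the function name, file path, and classifier are placeholders, not our actual Cloud code.

    import joblib
    from sklearn.ensemble import RandomForestClassifier

    def retrain_from_uploads(uploads, out_path="bee_model_next.joblib"):
        # uploads: iterable of (feature_vector, label) pairs, where each label
        # comes from the uploader's visual inspection report.
        X, y = zip(*uploads)
        model = RandomForestClassifier(n_estimators=200).fit(list(X), list(y))
        joblib.dump(model, out_path)   # packaged into the next automated app update
        return model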

In the meantime, even if the acoustic part of the app takes time to TUNE, the inspection reports, which are based on direct, classical, visual colony inspections but use a simplified, easy-to-complete drop-down survey form embedded in the app, will provide immediate GPS, date, and time data for flagging, alerting, and mapping bee health issues.
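As an illustration only, a record produced by such a form might look something like the following; the field names and values are made up for the example, not the app's actual schema.

    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone
    import json

    @dataclass
    class InspectionReport:
        latitude: float            # GPS, stamped by the phone
        longitude: float
        timestamp: str             # ISO 8601 date/time
        queen_seen: bool           # drop-down / checkbox answers
        varroa_level: str          # e.g. "none", "low", "high"
        notes: str = ""

    report = InspectionReport(
        latitude=46.86, longitude=-113.98,
        timestamp=datetime.now(timezone.utc).isoformat(),
        queen_seen=True, varroa_level="low",
    )
    payload = json.dumps(asdict(report))   # uploaded alongside the audio recording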
Peter, I'm as old as or older than you. I don't have the 37 years it took for Dragon to mature and spin off self-learning apps like those in Alexa, Siri, etc.
We waited this long because smartphones were too slow to conduct our AI-powered analyses in a reasonably short time. A few years ago, it took 20 minutes. As of version 8 of the Android and iPhone software, it takes 15 seconds or less! It was time to move forward before I croaked of old age. Jerry
Final thoughts: it's in the hands of beekeepers. If we get a large enough team; get scientists who have known/controlled experiments, such as mite farms, Small Hive Beetle trials, and Nosema research, where colonies are recorded and characterized for levels of infection or infestation; find bee inspectors who will record 'sick' colonies as well as healthy colonies; and get commercial beekeepers to do the same AND HIT the upload-to-the-CLOUD button, we can rapidly TUNE our app for performance and accuracy. Note that all of our participants simply send us recordings, the app analysis, and a visual inspection report. They do not analyze or interpret the data. The app does, and it will be updated by us based on our TUNING of the app, using the data provided. Like Dragon, what initially will be done centrally should eventually lead to self-learning in the phone versions of our app.
