Autonomous Camera Traps for Insects / Feed

Camera trapping for insects is becoming a reality using advances in camera, AI, and autonomous systems technologies. This group discusses the latest advances, shares experiences, and offers a space for anyone interested in the technology, from beginners to experts.


Insect camera traps for phototactic insects and diurnal pollinating insects

Hello, we developed an automated camera trap for phototactic insects a few years ago and are planning on further developing our system to also assess diurnal pollinating...


Hi Sarah, 

I will definitely link things once I start publishing. Please let me know how the different colours work for you. 

Good luck!


Hi Lars,

We had a couple of test colours in the field for about six weeks. Some still look quite good; others are nearly gone. I would recommend doing some pre-testing before you place them in the field for longer, and respraying every year. However, the colours I recommended before still look good, although I should test whether they have lost some of their fluorescence.

Best regards,



Capture And Identify Flying Insects

Hello everyone, I already found a lot of helpful information on this page, though I am having a hard time finding a system which is confirmed to be able to identify insects that fly...


This sounds like an interesting challenge. I think depth of focus and shutter speed are going to be challenging. You'll need a fast shutter speed to get sharp images of insects in flight. Are you interested in species ID, or are you more interested in abundance? Having a backboard on the other side of the hotel would be a good idea to remove background clutter from your images.
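As a sanity check on shutter speed, a quick back-of-envelope blur estimate helps. The flight speed and exposure below are illustrative assumptions, not measurements from this thread:

```python
# Rough motion-blur estimate for insects in flight.
# Assumed numbers: flight speed and exposure time are illustrative only.

def motion_blur_mm(speed_m_s: float, exposure_s: float) -> float:
    """Distance (in mm) an insect moves during one exposure."""
    return speed_m_s * exposure_s * 1000.0

# A hoverfly cruising at ~1 m/s photographed at 1/1000 s
# smears across about 1 mm of the scene:
blur = motion_blur_mm(1.0, 1 / 1000)
print(f"{blur:.2f} mm of blur")  # → 1.00 mm of blur
```

Whether 1 mm of blur is acceptable depends on the sensor resolution and field of view; for fine wing detail you would likely want an even shorter exposure or flash.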

Hi there,

I am also trying to get some visuals from wildlife cameras of insects visiting insect hotels. Was wondering if you had gained any further information on which cameras might be used for testing this?



Welcome to the Autonomous camera traps for insects group!

Hello and welcome to the Autonomous Camera Traps for Insects group :). In this group we will be discussing the use of autonomous camera traps as a tool for long-term remote...


Hi all,

My name is Peter van Lunteren and I’m a wildlife ecologist from The Netherlands. I’m not really an entomologist nor do I work much with insects - I’m just an enthusiast ;) Tom’s video on The WILDLABS Variety Hour has motivated me to join this discussion group.

I have made an open source application called EcoAssist where people can train and deploy YOLOv5 object detection models. It started out as a GUI for the MegaDetector model to filter out empty camera trap images, but evolved to become a platform where ecologists can transfer knowledge from existing models to easily train their own species classifiers.

I was wondering whether it would be possible to include an insect detection model in the application, so that people can use it as a starting point to train their own insect model. Unfortunately, though, it is limited to YOLOv5 models. Does anyone know of a YOLOv5 insect detection model?

Cheers and keep up the good work,


Hi Peter,

EcoAssist looks really cool! It's great that you combined every step for custom model training and deployment into one application. I will take a deeper look at it asap.

Regarding YOLOv5 insect detection models:

  • Bjerge et al. (2023) released a dataset with annotated insects on complex background together with three YOLOv5 models at Zenodo.
  • For a DIY camera trap for automated insect monitoring, we published a dataset with annotated insects on homogeneous background (flower platform) at Roboflow Universe and at Zenodo. The available models that are trained on this dataset are converted to .blob format for deployment on the Luxonis OAK cameras. If you are interested, I could train a YOLOv5 model with your preferred parameters and input size and send it to you in PyTorch format (and/or ONNX for CPU inference) to include in your application. Of course you can also use the dataset to train the model on your own.
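For anyone wanting to convert a trained model themselves, here is a hedged sketch that assembles the export call used by the ultralytics/yolov5 repo's export.py. The flag names follow the repo's documented CLI, but verify them against your yolov5 version; the weights filename is made up:

```python
# Hypothetical helper that builds the export.py invocation from the
# ultralytics/yolov5 repo, e.g. to get an ONNX model for CPU inference.

def yolov5_export_cmd(weights: str, fmt: str, imgsz: int = 640) -> list[str]:
    """Assemble a `python export.py ...` command for a given target format."""
    supported = {"onnx", "torchscript", "tflite", "openvino"}
    if fmt not in supported:
        raise ValueError(f"unsupported format: {fmt}")
    return [
        "python", "export.py",
        "--weights", weights,        # trained .pt checkpoint
        "--include", fmt,            # export format
        "--imgsz", str(imgsz),       # input size the model was trained with
    ]

cmd = yolov5_export_cmd("insect_detector.pt", "onnx", 320)
print(" ".join(cmd))
```

For the OAK cameras mentioned above, the usual route (as I understand it) goes via OpenVINO and Luxonis' blobconverter tool rather than a direct export, so check the Luxonis documentation for that step.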


See full post

Testing Raspberry Pi cameras: Results

So, we (mainly @albags ) have done some tests to compare the camera we currently use in the...


Hi Tom and Alba,

Great comparison! From what distance did you take the images?

It would be interesting to also check the respective power consumption of the cameras while recording images, and the latency (how long it takes to save an image at full resolution) on the RPi 4/Zero. Also, I'm wondering how the slightly improved image quality (e.g. between the Pi Camera 3 and the IMX519) would actually affect classification accuracy during deployment. I'm pretty sure there will be no difference for detection alone, and the impact on classification probably depends on the differences between classes.

From some tests I did with a YOLOv5s-cls classification model, trained on our classification dataset (group level only), I got the feeling that even significant downscaling (50%, so only half of the pixels are "kept" from the original image) leads to only a minor decrease in classification accuracy. Of course this will depend heavily on the dataset, and classes with finer details/differences will probably be classified more accurately if you use a higher input image resolution.
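To make the "50% downscaling" concrete: keeping half of the pixels means shrinking each side by sqrt(0.5), roughly 71%. A minimal sketch (the dimensions are illustrative, not from the dataset above):

```python
import math

def downscaled_dims(width: int, height: int, pixel_fraction: float) -> tuple[int, int]:
    """New (width, height) after keeping `pixel_fraction` of the pixels.

    Each side is scaled by sqrt(pixel_fraction) so that
    new_w * new_h ≈ pixel_fraction * (width * height).
    """
    s = math.sqrt(pixel_fraction)
    return round(width * s), round(height * s)

# A 1280x960 crop reduced to half its pixels:
print(downscaled_dims(1280, 960, 0.5))  # → (905, 679)
```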

There is definitely still a lot of potential for testing different recording hardware and its impact on automated detection/classification, especially regarding the trade-off between accuracy, speed, power consumption and disk storage.



What is the best light for attracting moths?

We want to upgrade the UV lights on our moth traps. We currently use a UV fluorescent tube...


We have also thought about these sorts of things. We have chosen to keep the light on continuously for the night, but turn it off before dawn to allow the moths to fly away before predators arrive. 

We are going to be trying out the EntoLEDs and LepiLEDs in Panama in the last two weeks of January, I'll post here on my thoughts.

Would be great to hear more. We found that the LepiLED was great! The Ento mini did not attract as many, but compensated with many nights of deployment it would probably work okay.


Hack a momentary on-off button 

I have several very bright UV LEDs that I bought for cheap online that are built into nice housings of UV curing lamps and flashlights that can already be automatically powered by...




Hi @hikinghack

If I am understanding correctly, you want the UV lights to come on and go off at a certain time (?) and to emulate the button push which actually switches them on and off. Is the momentary switch the little button at the top of the image you attached? Is it going to be controlled by a timer or a microcontroller at all? Sorry for all the questions, but I am not 100% clear on exactly what you are after. In the meantime, I've linked to a pretty decent tutorial on the process of hacking a momentary switch with a view to automating it with an Arduino microcontroller board, although it assumes a bit of electronics knowledge (e.g. MOSFETs/transistors) in places.

Alternatively, this tutorial is also good, with good explanations throughout:

If neither of these help, let me know and there might be some easier instructions I can put together. 

All the best,


Hi Andrew,

If I understand you correctly, you want to turn on the LEDs when USB power is applied. The easiest way I can see to do this is to reroute the red wire to the USB-C VBUS pin, via an appropriate current-limiting resistor. This bypasses all the electronics in your photo.

You could insert the current limiting resistor in the USB cable for better heat dissipation, or use a DC-DC constant current source instead of a resistor if power consumption is a concern.
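To put rough numbers on the resistor sizing: the forward voltage and drive current below are assumptions for a typical UV LED, so check the datasheet for your actual part before trusting them.

```python
# Back-of-envelope sizing for a series current-limiting resistor on USB VBUS.
# Assumed values: 5 V VBUS, ~3.4 V UV LED forward voltage, 20 mA target current.

def limiting_resistor(v_supply: float, v_forward: float, i_amps: float) -> tuple[float, float]:
    """Return (resistance in ohms, power dissipated in watts) for the resistor."""
    v_drop = v_supply - v_forward        # voltage the resistor must drop
    return v_drop / i_amps, v_drop * i_amps

r, p = limiting_resistor(5.0, 3.4, 0.020)
print(f"{r:.0f} ohm, dissipating {p * 1000:.0f} mW")  # → 80 ohm, dissipating 32 mW
```

At these currents an ordinary 1/4 W resistor has plenty of margin; the bright LEDs in the original post likely draw far more, which is where the heat-dissipation and constant-current-source points above come in.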

Further to @htarold 's excellent suggestion, you can replace that entire PCB with a simple USB breakout board (e.g. USB micro attached below) by removing the red wire and attaching it to VCC on the breakout board, and removing and attaching the black wire to GND. 


Who's going to ESA in Portland this year?

So who is going to be at the Ecological Society of America meeting this year in Portland, August 6-11th? It will be my first...


Indeed, I'll be there too!  I like to meet new conservation friends with morning runs, so I will likely organize a couple of runs, maybe one right near the conference, and one somewhere in a nearby park where we can look for wildlife.  The latter would probably be at an obscenely early hour, so we can drive somewhere, ideally see elk (there are elk within 25 minutes of Portland!), and still get back in time for the morning sessions.


Project introductions and updates

Tell us about your project! If you are just starting out with autonomous camera traps for insects, or if you are a seasoned expert, this is the place to share your...


The AMI-Trap

Combining robust lighting for attracting insects with high resolution cameras, the AMI-trap can provide practical and cost-effective solutions for standardised monitoring. AMI-traps have been deployed in the UK, Canada, USA, Cyprus, Panama and Argentina, with plans to expand further.

The AMI-Trap is an iteration of the design first published by Bjerge et al. (2021), and has been developed in partnership with those authors as well as researchers in Europe and North America. An open AI processing workflow has been developed at Mila (see an earlier post in this thread), which takes images from the AMI-Trap, locates and tracks individuals, and classifies individuals where possible.

AMI-Traps are solar powered and run on a programmable schedule. As a result they can be left out for an entire field season to collect data; however, checking every now and then to see if they are okay is recommended! The hardware has been tested underwater and in a 60°C oven to ensure robustness in field conditions. The development of the AMI-Trap hardware at UKCEH, and the development of the AI, edge processing, standards and more, is funded by a number of projects across the partners involved.
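As a rough illustration of why the solar-plus-schedule combination allows season-long deployments, here is a minimal energy-budget sketch. The battery capacity, load, and schedule are invented numbers for illustration, not AMI-Trap specifications:

```python
# Toy energy budget for a scheduled, solar-charged insect trap.
# All figures are illustrative assumptions, not measured values.

def nights_of_autonomy(battery_wh: float, load_w: float, hours_per_night: float) -> float:
    """Nights the battery alone can run the trap with zero solar input."""
    return battery_wh / (load_w * hours_per_night)

# e.g. a 50 Wh battery, a 5 W lamp + camera load, operating 6 h per night:
print(f"{nights_of_autonomy(50, 5, 6):.1f} nights")  # → 1.7 nights
```

With numbers like these, the solar panel only needs to put back ~30 Wh on an average day to keep the system indefinitely topped up, which is why a programmable night schedule matters so much more than raw battery size.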

UKCEH can build AMI-Traps for interested researchers; funds generated are spent exclusively on supporting and developing the AMI-Trap.

Find out more: 


Hi all! I'm part of a Pollinator Monitoring Program at California State University, San Marcos which was started by a colleague lecturer of mine who was interested in learning more about the efficacy of pollinator gardens. It grew to include comparing local natural habitat of the Coastal Sage Scrub and I was initially brought on board to assist with data analysis, data management, etc. We then pivoted to the idea of using camera traps and AI for insect detection in place of the in-person monitoring approach (for increasing data and adding a cool tech angle to the effort, given it is of interest to local community partners that have pollinator gardens). 

The group heavily involves students as researchers, and they are instrumental to the projects. We have settled on a combination of video footage and development of deep neural networks using the cloud-hosted video track detection tool, VIAME (developed by Kitware for NOAA Fisheries originally for fish track detection). Students built our first two PICTs (low-cost camera traps), and annotated the data from our pilot study that we are currently starting the process of network development for. Here's a cool pic of the easy-to-use interface that students use when annotating data: 

Figure 1: VIAME software demonstrating annotation of the track of an insect in the video (red box). Annotations are done manually to develop a neural network for the automated processing.

The goal of the group's camera trap team is to develop a neural network that can track insect pollinators associated with a wide variety of plants, and to use this information to collect large datasets to better understand pollinator occurrence and activity within local habitats. This ultimately relates to native habitat health and can be used for long-term tracking of changes in the ecosystem, with the idea that knowledge of pollinators may inform resource and conservation managers, as well as local organizations, in their land use practices. We are ultimately interested in working with the Kitware folks further, not only to develop a robust network (and share it broadly, of course!), but also to customize the data extraction from automated tracks to include automated species/species-group identification and information on interaction rates by those pollinators. We would love any suggestions for appropriate proposals to apply to, as well as any information or suggestions regarding the PICT camera or methods in general. We are also looking to include night-time data collection at some point and are aware that near-infrared is advised, but would appreciate any thoughts or advice on that avenue as well.

We will of course post when we have more results and look forward to hearing more about all the interesting projects happening in this space!

Liz Ferguson 

Hi, indeed as Tom mentioned, I am working here in Vermont on moth monitoring using machines with Tom and others. We have a network going from here into Canada. Would love to catch up with you soon. I am away until late April but would love to connect after that!


Camera to follow wasps/attach on wasps

I am researching on the nesting on potter wasps in Bangalore, India. I have to follow nesting wasps to observe all its activities including where it takes rest at night. I am...


Hi @Lars_Holst_Hansen  @tom_august 

The link to the video is amazing. Thank you for it. 

The wasps that I am working on, are solitary. So, basically it is just this one female that builds the entire nest. Like what you (@tom_august) mentioned, the best option would be to keep a running camera at the nest to record the whole process of nest building. Having one placed inside will be difficult because even if we do work out a way to have lighting inside the nest, the light might be detrimental to the developing larva inside. Hence, it is likely not to be of any benefit.

I am totally smitten by the idea of having a sensor on the wasp body to track where it goes! We could get to know how far it travels to bring the prey and also to collect soil. 

@ShwetaMukundan I just saw this thesis published on tracking bees. Maybe you could use the same method? 


Exploring storage options for mass data collection

Hi all. I'm currently exploring options for data storage en masse. With our project we will be collecting 24 hr hydrophone data, 6 hr drone video per day, photography &...


Hi Adam!

I mostly live within the ecoacoustics space so I'll just speak on the hydrophone part of your request; Arbimon is a free web/cloud-based platform with unlimited storage for audio files. We've got an uploader app as well for mass-uploading lots of files. There's also a bunch of spectrogram visualization/annotation tools and analysis workflows available. It's AWS running under the hood.

I have some experience working directly with AWS & Microsoft Azure, and I've found personally that AWS was more user-friendly and intuitive for the (fairly simplistic) kinds of tasks I've done.  
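For budgeting the hydrophone side before choosing a platform, a quick sketch of uncompressed audio volume per day helps. The sample rate and bit depth below are assumptions; substitute your recorder's actual settings:

```python
# Daily storage estimate for continuous uncompressed (PCM/WAV) audio.
# Assumed settings: 48 kHz sample rate, 16-bit samples, mono, 24 h/day.

def wav_bytes_per_day(sample_rate_hz: int, bits: int, channels: int, hours: float) -> int:
    """Uncompressed PCM bytes for `hours` of recording per day."""
    return int(sample_rate_hz * (bits // 8) * channels * hours * 3600)

gb = wav_bytes_per_day(48_000, 16, 1, 24) / 1e9
print(f"{gb:.1f} GB/day")  # → 8.3 GB/day
```

Lossless compression (e.g. FLAC) typically cuts this substantially, and the drone video will likely dwarf the audio anyway, so it's worth running the same arithmetic for each data stream.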


Catch up with The Variety Hour: March

Catch up on our March Variety Hour, where we talk about building autonomous camera traps for insects, get an update on the Arbimon tool for ecoacoustics, and learn about the Biodiversity Accelerator+, which is now open...


Monitoring airborne biomass

We are producing hardware and software for small-scale research radars to monitor airborne biomass automatically. We can distinguish birds, bats and insects through day and...


Looks like you want to have a read of this thread: 

Our project, in short, is to set up a sensor network for monitoring airborne biomass (mainly insects, birds and bats) in near real time, and to develop a forecast model to be used for mitigating various types of human-wildlife conflict (e.g. wind power, pesticide application, aviation). Our expertise is mainly in radar monitoring, but we aim to add insect camera information to be merged with the quantitative biomass measurements from radar.


AMI-trap unboxing - Automated moth monitoring system

Thanks so much to @JoeBowden for recording this video of his unboxing of the AMI-Trap (Automated Monitoring of Insects Trap...


Here is the website where you can find more info about the AMI-Trap.

This video is so great - I don't know what I was imagining that you were building, but this is so much bigger and more involved than whatever I was vaguely thinking. Really cool to see! 

Side comment - could we make conservation tech unboxings a thing? 


Solar panels in the tropics

We are deploying automated systems in the tropics and hope to use solar panels, but with the closed canopy in most places I'm seeing this as a challenge. Past the obvious: 'find a...


Hi Tom,

I'm with Akiba: you have to test. A collaborator has deployed solar-augmented kit in secondary jungle; some units got enough light and others didn't, so it can work. The open-circuit voltage of solar panels doesn't change a whole lot in dim light, but the current drops drastically. So you would choose an oversized panel of the same voltage (or a bit higher).
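A rough way to act on the "oversize the panel" advice: since panel current scales roughly with irradiance, size the panel for the light that actually reaches it under the canopy. The shading fraction and derating figure below are assumptions you would measure or refine on site:

```python
# Toy sizing rule for a solar panel under canopy shade.
# Assumptions: panel output scales linearly with received light, and a
# derating factor covers wiring, charge-controller, and temperature losses.

def required_panel_watts(load_watts: float, light_fraction: float, derating: float = 0.8) -> float:
    """Rated panel power needed when only `light_fraction` of full sun arrives."""
    return load_watts / (light_fraction * derating)

# A 2 W average load under 15 % of full sun:
print(f"{required_panel_watts(2.0, 0.15):.0f} W panel")  # → 17 W panel
```

The linear-scaling assumption is optimistic in deep shade (diffuse light and partial-shading effects hit some panel types harder than others), which is another argument for the field testing suggested above.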


I've been intrigued by this topic. Thinking about ways you could use drones or some kind of launcher to deploy panels above the canopy. Sadly I live in the great white north so I have no way of testing any concepts. Maybe even some kind of solar balloon that could float above the canopy. Interesting design problem.

Hey Tom,
Since the output depends on several factors, such as the solar irradiance of the site, shading from the canopy, the type of solar panels (mono, poly or amorphous), the orientation of the panels, etc., I'd suggest you use software to simulate the different parameters to get a fairly accurate estimate of the output. You can try PVsyst (it has a free one-month trial; I haven't used it before, but I hear it's great) or any other PV software :)


Cameras - pros and cons

So, what makes a good camera for an autonomous camera trap for insects? We use a web camera in our system, which seems to work well a lot of the time; it produces...


Hi Liz, unfortunately you will still need a Raspberry Pi as host for the OAK-1 camera to reproduce our hardware setup. It's also possible to use another Linux-based system (e.g. Raspberry Pi alternatives), but I didn't test this myself and the setup process will be different to our documentation (and probably not so straightforward). I'm planning to publish the documentation website in the next weeks, but I can already send you detailed information about putting together the hardware if you are still interested.



I'm working on a lightweight light trap based on Bjerge et al. (2021); however, I opted to use an ArduCam 64 MP (9152 x 6944 resolution). It's designed for the Pi specifically, and at $60 it checks many of your criteria. I haven't put everything together yet, so I can't speak to white balance and power usage, but the autofocus appears to work well in initial tests, and it is tiny.




Awesome! It would be great to hear how you get on; maybe you can share your results here when you have them. Is the camera only for the Pi? That could be a problem for scaling, as Pis are quite hard to come by at the moment.


Metadata standards for Automated Insect Camera Traps

Have others watched this webinar from GBIF introducing the data model for camera trap data? I wonder if this is something we can easily adopt/adapt for our sorts of camera traps?


I did attend the webinar and had a strong feeling that this standard will be well supported and taken up in the camera trapping community! I would also love to hear if someone has tried to use it.
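For anyone wanting to experiment before committing, here is a hypothetical minimal observation record loosely modelled on the Camtrap DP observations table. The field names and values are illustrative and should be checked against the published standard before adopting them:

```python
import json

# Illustrative (not authoritative) observation record in the spirit of the
# GBIF/Camtrap DP data model; field names are assumptions to be verified.
observation = {
    "observationID": "obs-0001",
    "deploymentID": "amitrap-site-03",
    "mediaID": "img-2023-06-12T22-14-00",
    "eventStart": "2023-06-12T22:14:00Z",
    "observationType": "animal",
    "scientificName": "Noctua pronuba",
    "count": 1,
    "classificationMethod": "machine",
    "classificationProbability": 0.87,
}

# Records like this serialize cleanly to JSON for exchange between tools:
print(json.dumps(observation, indent=2))
```

For insect traps the interesting extension points are probably the classification fields (method, probability) since most detections will come from a model rather than a human.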

I've added this to the main camera trap thread as it would be good to get thoughts from those folk too.

Yes. I think this is really the way to go!


Easy-RIDER project Workshop IV: Pollinator monitoring recording

In case you missed our webinar on Pollinator monitoring, here is the recording.

We had presentations from three teams presenting their work on designing automated monitoring tools for flower-visiting insects, different ways of creating datasets for training machine learning algorithms for insect identification, and how these new technologies can be integrated into traditional monitoring schemes. The talks were followed by a discussion session.


Implementation of video surveillance to quantify the predation rate

Hello everyone,First of all, thank you for all the information on your great website. My name is Julien Péters and I am a PhD student at the University of Liège (Belgium). For my...



We are having this problem too, and it might be worthy of its own thread! The lack of Raspberry Pis is a big problem and we are currently looking into alternatives. We haven't found one yet, but if we do I will let you know. @Max_Sitt might have some suggested alternatives for his system?

Hi Julien,


we are working with the Luxonis OAK-1, which can run lightweight detection models (e.g. YOLOv5n/s) directly on-device. However, you will still need a host; for outdoor deployment a Raspberry Pi (e.g. Zero 2 W) is perfect. For testing you could also use another Linux-based system as the host device, or just connect it to e.g. your notebook. You can find more info in the Luxonis docs.


Regarding the Raspberry Pi availability, this blog post from Jeff Geerling probably sums up the current situation pretty well. I hope in Q1 2023 the situation will get better, but at the moment nobody really knows for sure.


Workshop IV: Pollinator monitoring

This workshop is part of a series of online meetings to share experiences around the globe using automated technology (Cameras + AI) to monitor moths and other nocturnal insects.

This sounds amazing and I have advertised it among my colleagues. Unfortunately, I most probably will not be able to attend; however, it would be nice if you could provide the recording...