Chinstrap penguins are members of Antarctica’s brush-tailed group of penguins. They’re easily identified by the feature that gives them their name – a black strap that runs from ear to ear below the chin.
The species is found mostly in the Western Antarctic Peninsula region, on far-flung isles such as the South Shetland Islands, the South Sandwich Islands and the South Orkney Islands.
Chinstrap penguins are highly specialised predators, feeding on marine crustaceans called Antarctic krill. The birds are still very abundant (estimates suggest there are between 3 million and 4 million breeding pairs).
But many of their colonies are unfortunately experiencing population declines. The trend may be linked to krill becoming less available because of climate change, increasing populations of other marine predators (like baleen whales, which also eat krill) and commercial krill fishing.
So, it’s important to understand how much krill chinstrap penguins and other marine predators are consuming. This can help scientists to predict future population trends and inform conservation and ecosystem management strategies.
It’s challenging to observe directly how penguins catch their underwater prey in remote ocean habitats. However, thanks to technological innovations that allow ever more powerful remote monitoring, our understanding of their foraging behaviour has grown rapidly over recent decades.
We are part of a team of researchers that recently published a study underpinned by just such a technological innovation. Using animal-borne video and movement sensor data to train machine learning algorithms, we were able to quantify how much krill chinstrap penguins catch.
We used “deep learning”, a subset of machine learning, to detect the penguins’ feeding events. In our study, these algorithms not only performed classification tasks faster than human observers would be able to, but also detected patterns in the data that were difficult to observe visually.
To date, estimates of krill consumption by penguins have typically been derived from bio-energetic models, which are based on principles of physiology such as metabolic rates and how energy is assimilated from food. These estimates often can’t be empirically validated. An older method, stomach flushing, is highly invasive.
Animal-borne sensors provide continuous, high-resolution data on movement and behaviours, allowing large amounts of data to be recorded. But all that data needs to be analysed, which is not an easy task for humans. Machine learning algorithms can rapidly process these large datasets.
Decoding penguin foraging
We camped at the South Orkney Islands in January 2022 and January 2023 to collect data for this study. We used waterproof tape to attach miniature video cameras and tags with acceleration and pressure sensors to the backs of penguins breeding on the islands.
Each penguin collected data for a single foraging trip at sea, usually lasting less than a day. We removed the loggers when the penguins returned to their nests to feed their chicks.
To attach and to remove the devices, we caught the nesting penguins by hand, blindfolded them with a soft cloth hood and restrained them (in our hands) for a few minutes. The short handling times, and the small size of the tags, make any potential negative effects from this process unlikely.
The video footage allowed us to visually confirm each instance where the penguins were catching krill. The other sensors measured the penguins’ diving depths and the dynamics of movement (acceleration in three axes – surge, sway and heave, which allows body pitch and roll to be identified – at 25 data points per second).
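The body pitch and roll mentioned above can be recovered from the gravity component of the three-axis signal. Here is a minimal sketch using standard tilt-from-gravity formulas; it assumes a smoothed (static) acceleration sample in units of g, and the function name and axis ordering are illustrative, not from the study:

```python
import math

def pitch_roll_from_accel(surge, sway, heave):
    """Estimate body pitch and roll (in degrees) from one smoothed
    three-axis acceleration sample, in units of g.
    Standard tilt-from-gravity formulas; axis names follow the article."""
    pitch = math.degrees(math.atan2(surge, math.sqrt(sway**2 + heave**2)))
    roll = math.degrees(math.atan2(sway, heave))
    return pitch, roll

# A bird swimming level: gravity appears almost entirely on the heave axis.
print(pitch_roll_from_accel(0.0, 0.0, 1.0))  # → (0.0, 0.0)
```

In practice the gravity (static) component is first separated from the dynamic component, for example with a running mean over a second or two, before applying formulas like these.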
Because the video, acceleration and depth data were time-synchronised, we could use the video observations to identify and label the snippets of acceleration and depth data that corresponded to prey captures. Machine learning models were then trained on the accelerometer data, using the labels to indicate when prey captures occurred.
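The labeling step above can be sketched in code: split the time-synchronised accelerometer stream into fixed windows and mark each window according to whether a video-confirmed capture falls inside it. The 25 Hz sampling rate comes from the article; the one-second window length and all names are assumptions for illustration:

```python
# Hypothetical sketch: window time-synchronised accelerometer data and
# label each window using video-confirmed capture times.

RATE_HZ = 25       # 25 acceleration samples per second (from the article)
WINDOW_S = 1.0     # one-second analysis windows (an assumption)

def label_windows(accel_samples, capture_times_s):
    """accel_samples: list of (surge, sway, heave) tuples sampled at RATE_HZ.
    capture_times_s: video-derived times (seconds) of krill captures.
    Returns (window, label) pairs; label is 1 if a capture falls inside."""
    win_len = int(RATE_HZ * WINDOW_S)
    pairs = []
    for start in range(0, len(accel_samples) - win_len + 1, win_len):
        t0 = start / RATE_HZ
        t1 = (start + win_len) / RATE_HZ
        label = int(any(t0 <= t < t1 for t in capture_times_s))
        pairs.append((accel_samples[start:start + win_len], label))
    return pairs
```

The resulting labelled windows are what a supervised model is trained on: the window of raw movement data is the input, and the capture/no-capture label is the target.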
The results showed that the machine learning models we trained with labelled data are able to identify prey capture events from new acceleration and depth data with high accuracy.
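Accuracy for this kind of binary detection task is often summarised with precision (how many predicted captures were real) and recall (how many real captures were found). This is a generic illustration of those metrics; the labels below are made up and are not the study’s results:

```python
# Illustrative only: precision and recall for binary prey-capture predictions.

def precision_recall(true_labels, predicted_labels):
    """Compute precision and recall over paired binary labels (1 = capture)."""
    tp = sum(1 for t, p in zip(true_labels, predicted_labels) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(true_labels, predicted_labels) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(true_labels, predicted_labels) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # hypothetical video-confirmed labels
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # hypothetical model predictions
print(precision_recall(y_true, y_pred))  # → (0.75, 0.75)
```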
What’s exciting is that the machine learning model can now work in the absence of video data, identifying prey capture events from new acceleration and depth data. In future, we can therefore use a single acceleration and depth bio-logging tag per bird to obtain information on prey captures in this species.
Obtaining foraging information without video is preferable for monitoring purposes, since video cameras only record a few hours before depleting their batteries, whereas acceleration and depth can be measured over many days.
Guiding conservation
Our hope is that the method we developed can be used to monitor temporal and spatial changes in how much chinstrap penguins eat, to help guide conservation and ecosystem management in the Southern Ocean around Antarctica. - The Conversation