HOUSE BEAVER

A water leak detector with machine learning
Invented in Dyson lab /
Dyson library, Imperial College Rd, South Kensington, London SW7 9EG
Device, 2021
l 9,000 x w 3,200 x h 4,000 mm (l 9.8 x w 3.5 x h 4.4 yd)
House Beaver: Arduino Nano 33 BLE, NeoPixel, sound level meter [SEN0232], DC-DC adapter, lithium battery
Beaver Dam: Arduino Nano 33 BLE, directional microphones, 360-degree continuous servo motor, camera, LED lights

This device was developed to help ordinary people prevent and locate water leaks at home. Domestic water leaks waste a scarce essential resource, pollute land and water, damage personal property, and cause disputes between neighbours. Disputes as serious as murder have reportedly occurred in Northeast Asia, where apartments are common and no related insurance exists. Contrary to common assumption, leaks do not only travel from an upper flat to the one below: in multi-family houses, leaking water can emerge in unexpected places through gaps in the walls. Yet the only way for ordinary people to find such leaks is to rely on experts with expensive equipment, and in developing countries, where water leaks cause serious damage, there is often no way to prevent them at all. The House Beaver uses machine learning, trained on accumulated recordings, to distinguish the sound of a water leak from other sounds and show the result to the user visually.


Features


Why did this need to be developed?


User journey


Development process

Step 1. Gathering the water leak sounds -> Step 2. Creating libraries for machine learning -> Step 3. Making physical prototypes / circuit diagrams -> Step 4. Coding -> Step 5. User test


Step 1. Gathering the water leak sounds

We made small holes in PVC pipes, stainless steel pipes, copper pipes, and flexible tubes, and prepared boxes made of cement, styrofoam, wood, and plastic. The pipes and tubes were buried inside the boxes and water was injected so that the sound of leaking water could be recorded. Additional water leak recordings were collected online and from experts.


Step 2. Creating libraries for machine learning >>link


Step 3-1. Making physical prototype / Circuit diagram – Portable type (House Beaver)


Step 3-2. Making physical prototype / Circuit diagram – Installation type (Beaver Dam)


Step 4. Coding

1. Install the Arduino Nano 33 BLE development environment. Please refer to the link below.
https://www.arduino.cc/en/Guide/NANO33BLE

2. Add the library that turns the detection sounds into data:
Sketch > Include Library > Add .ZIP Library... -> ei-changsublee-project-1-arduino-1.0.12.zip
Tools > Manage Libraries... > search for Adafruit_NeoPixel and install it

3. Check the comment at the top of the program, select the board, and compile.

4. How the program works:
The detector board checks whether the current sound resembles the pre-trained leak sounds, and at every check it also measures the surrounding sound level.
It sends the detection result and the sound level to the LED board.
When the LED board receives a detection together with the sound level, it lights the LEDs.
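The handshake in item 4 can be sketched in ordinary C++. This is an illustrative model only: the names Signal and make_signal are hypothetical, and the actual sketches below signal the LED board over a GPIO pin rather than returning a struct. It assumes detection means the leak-class score crossing a threshold.

```cpp
// Hypothetical model of the detector-to-LED handshake described above.
// score: classifier confidence for the leak-like class (0.0 .. 1.0)
// level: ambient sound level reading (raw ADC units)
struct Signal {
    bool detected;  // should the LED board light up?
    int  level;     // sound level forwarded alongside the detection flag
};

Signal make_signal(float score, int level, float threshold = 0.5f) {
    Signal s;
    s.detected = (score >= threshold);  // leak-like sound crossed the threshold?
    s.level = level;                    // always forward the ambient level
    return s;
}
```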


Step 4-1. Coding – Portable type (house Beaver)

#define EIDSP_QUANTIZE_FILTERBANK 0
#include <PDM.h>
#include <ChangSubLee-project-1_inferencing.h>
#define DATA 2

#define DECIBAL A0

/** Audio buffers, pointers and selectors */
typedef struct {
    int16_t *buffer;
    uint8_t buf_ready;
    uint32_t buf_count;
    uint32_t n_samples;
} inference_t;

static inference_t inference;
static bool record_ready = false;
static signed short sampleBuffer[2048];
static bool debug_nn = false; // Set this to true to see e.g. features generated from the raw signal
int mydB;

void setup()
{
    // put your setup code here, to run once:
    Serial.begin(115200);
    Serial.println("Edge Impulse Inferencing Demo");
    pinMode(DATA, OUTPUT);

    // summary of inferencing settings (from model_metadata.h)
    // ei_printf("Inferencing settings:\n");
    // ei_printf("\tInterval: %.2f ms.\n", (float)EI_CLASSIFIER_INTERVAL_MS);
    // ei_printf("\tFrame size: %d\n", EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE);
    // ei_printf("\tSample length: %d ms.\n", EI_CLASSIFIER_RAW_SAMPLE_COUNT / 16);
    // ei_printf("\tNo. of classes: %d\n", sizeof(ei_classifier_inferencing_categories) / sizeof(ei_classifier_inferencing_categories[0]));

    if (microphone_inference_start(EI_CLASSIFIER_RAW_SAMPLE_COUNT) == false) {
        ei_printf("ERR: Failed to setup audio sampling\r\n");
        return;
    }
}

void loop()
{
    //ei_printf("Starting inferencing in 2 seconds...\n");
    digitalWrite(DATA, LOW);
    delay(2000);

    ei_printf("Recording...\n");
    bool m = microphone_inference_record();
    if (!m) {
        ei_printf("ERR: Failed to record audio...\n");
        return;
    }
    ei_printf("Recording done\n");

    signal_t signal;
    signal.total_length = EI_CLASSIFIER_RAW_SAMPLE_COUNT;
    signal.get_data = &microphone_audio_signal_get_data;
    ei_impulse_result_t result = { 0 };

    EI_IMPULSE_ERROR r = run_classifier(&signal, &result, debug_nn);
    if (r != EI_IMPULSE_OK) {
        ei_printf("ERR: Failed to run classifier (%d)\n", r);
        return;
    }

    // print the predictions
    //ei_printf("Predictions (DSP: %d ms., Classification: %d ms., Anomaly: %d ms.): \n",
    //          result.timing.dsp, result.timing.classification, result.timing.anomaly);
    for (size_t ix = 1; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        Serial.print(result.classification[ix].value);
        float Data = result.classification[ix].value;
        //int OUT = map(mydB, 150, 900, 0, 255);
        Serial.print("\nOUT: ");
        if (Data < 0.50) {
            digitalWrite(DATA, HIGH);  // signal the LED board
            Serial.print("Leak detected");
        }
    }
    //for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
    //    ei_printf("    %s: %.5f\n", result.classification[ix].label, result.classification[ix].value);
    //}
#if EI_CLASSIFIER_HAS_ANOMALY == 1
    ei_printf("    anomaly score: %.3f\n", result.anomaly);
#endif
}

void ei_printf(const char *format, ...) {
    static char print_buf[1024] = { 0 };
    va_list args;
    va_start(args, format);
    int r = vsnprintf(print_buf, sizeof(print_buf), format, args);
    va_end(args);
    if (r > 0) {
        Serial.write(print_buf);
    }
}

static void pdm_data_ready_inference_callback(void)
{
    int bytesAvailable = PDM.available();
    // read into the sample buffer
    int bytesRead = PDM.read((char *)&sampleBuffer[0], bytesAvailable);
    if (record_ready == true || inference.buf_ready == 1) {
        for (int i = 0; i < bytesRead >> 1; i++) {
            inference.buffer[inference.buf_count++] = sampleBuffer[i];
            if (inference.buf_count >= inference.n_samples) {
                inference.buf_count = 0;
                inference.buf_ready = 1;
            }
        }
    }
}

static bool microphone_inference_start(uint32_t n_samples)
{
    inference.buffer = (int16_t *)malloc(n_samples * sizeof(int16_t));
    if (inference.buffer == NULL) {
        return false;
    }
    inference.buf_count = 0;
    inference.n_samples = n_samples;
    inference.buf_ready = 0;

    // configure the data receive callback
    PDM.onReceive(&pdm_data_ready_inference_callback);
    // optionally set the gain, defaults to 20
    PDM.setGain(80);
    //ei_printf("Sector size: %d nblocks: %d\r\n", ei_nano_fs_get_block_size(), n_sample_blocks);
    PDM.setBufferSize(4096);

    // initialize PDM with:
    // - one channel (mono mode)
    // - a 16 kHz sample rate
    if (!PDM.begin(1, EI_CLASSIFIER_FREQUENCY)) {
        ei_printf("Failed to start PDM!");
    }

    record_ready = true;
    return true;
}

static bool microphone_inference_record(void)
{
    inference.buf_ready = 0;
    inference.buf_count = 0;
    mydB = 0;
    analogWrite(DATA, 125);
    while (inference.buf_ready == 0) {
        delay(10);
    }
    analogWrite(DATA, 0);
    return true;
}

static int microphone_audio_signal_get_data(size_t offset, size_t length, float *out_ptr)
{
    arm_q15_to_float(&inference.buffer[offset], out_ptr, length);
    return 0;
}

static void microphone_inference_end(void)
{
    PDM.end();
    free(inference.buffer);
}
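The PDM callback above fills a ring buffer: incoming samples are appended until n_samples have accumulated, then the write index wraps and buf_ready is raised so loop() can run the classifier. The same logic, stripped of the Arduino/PDM specifics, can be modelled in standalone C++ (the Inference struct here is an illustrative stand-in, not the library type):

```cpp
#include <cstdint>
#include <vector>

// Minimal model of the ring-buffer fill performed in
// pdm_data_ready_inference_callback(): append samples, wrap at
// n_samples, and flag the buffer ready for the classifier.
struct Inference {
    std::vector<int16_t> buffer;
    uint32_t buf_count = 0;
    uint32_t n_samples;
    uint8_t  buf_ready = 0;

    explicit Inference(uint32_t n) : buffer(n), n_samples(n) {}

    void push(const int16_t *samples, int count) {
        for (int i = 0; i < count; i++) {
            buffer[buf_count++] = samples[i];
            if (buf_count >= n_samples) {  // one full window collected
                buf_count = 0;             // wrap, as the callback does
                buf_ready = 1;             // signal the main loop
            }
        }
    }
};
```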

 


Step 4-2-1. Coding – LED for Installation type (Beaver Dam)

#include <Adafruit_NeoPixel.h>

#define PIN 6
#define NUMLED 12

// Create an object to drive the NeoPixel.
// The first argument is the number of LEDs on the NeoPixel strip.
// The second argument is the Arduino pin connected to the NeoPixel.
// The third argument is a flag that depends on the type of NeoPixel.
Adafruit_NeoPixel strip = Adafruit_NeoPixel(NUMLED, PIN, NEO_GRB + NEO_KHZ800);
uint32_t myColor[24] = {strip.Color(255, 0, 0), strip.Color(0, 255, 0), strip.Color(0, 0, 255), strip.Color(255, 0, 0), strip.Color(0, 255, 0), strip.Color(0, 0, 255),
                        strip.Color(255, 0, 0), strip.Color(0, 255, 0), strip.Color(0, 0, 255), strip.Color(255, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 0, 0),
                        strip.Color(255, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 0, 0),
                        strip.Color(255, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 0, 0)};

int myData = 0;
int flag = 1;

void setup() {
    strip.begin();  // turn all LEDs off to initialize the NeoPixel
    strip.show();
    pinMode(3, INPUT);
    pinMode(4, INPUT);

    Serial.begin(115200);
    Serial.println("Edge Impulse Inferencing Demo");
}

void loop() {
    // Cycle the NeoPixel in the order below.

    while (digitalRead(3)) {
        int a = analogRead(A1);
        myData = myData > a ? myData : a;  // track the peak reading (0..1023)
        flag = 1;
    }
    //Serial.println(myData);
    if (digitalRead(4) && flag) {
        Serial.print("detect");
        Serial.println(myData);
        flag = 0;
        int Lv = 0;
        if (myData < 200) {         // level 0
            Lv = 0;
        } else if (myData < 250) {  // level 1
            Lv = 1;
        } else if (myData < 300) {  // level 2
            Lv = 2;
        } else if (myData < 350) {  // level 3
            Lv = 3;
        } else if (myData < 400) {  // level 4
            Lv = 4;
        } else if (myData < 430) {  // level 5
            Lv = 5;
        } else if (myData < 460) {  // level 6
            Lv = 6;
        } else if (myData < 480) {  // level 7
            Lv = 7;
        } else if (myData < 500) {  // level 8
            Lv = 8;
        } else if (myData < 550) {  // level 9
            Lv = 9;
        } else if (myData < 600) {  // level 10
            Lv = 10;
        } else if (myData < 650) {  // level 11
            Lv = 11;
        } else if (myData < 700) {  // level 12
            Lv = 12;
        }

        myData = 0;

        for (uint16_t i = 0; i < Lv; i++) {
            strip.setPixelColor(i, strip.Color(255, 255, 0));
            strip.show();
        }
        for (uint16_t i = Lv; i < strip.numPixels(); i++) {
            strip.setPixelColor(i, strip.Color(0, 0, 255));
            strip.show();
        }

        // for (uint16_t i = 0; i < Lv; i++) {
        //     strip.setPixelColor(i, myColor[i]);
        //     strip.show();
        // }
        // for (uint16_t i = Lv; i < strip.numPixels(); i++) {
        //     strip.setPixelColor(i, myColor[i]);
        //     strip.show();
        // }

        Serial.print("DETECT");
        Serial.println(Lv);
        delay(1000);
        for (uint16_t i = 0; i < strip.numPixels(); i++) {
            strip.setPixelColor(i, strip.Color(0, 0, 255));
            strip.show();
        }
    }
}
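The long if/else ladder that maps the peak reading to an LED count can also be written as a table lookup over the same thresholds. A standalone sketch of that (the function name sound_to_level is hypothetical; note it clamps readings of 700 or more to the top level, whereas the ladder above leaves Lv at 0 in that case, which looks like an oversight):

```cpp
// Table-driven version of the if/else ladder: the index of the first
// threshold that the reading falls below becomes the LED level (0..12).
int sound_to_level(int myData) {
    const int thresholds[] = {200, 250, 300, 350, 400, 430,
                              460, 480, 500, 550, 600, 650, 700};
    const int n = sizeof(thresholds) / sizeof(thresholds[0]);
    for (int i = 0; i < n; i++) {
        if (myData < thresholds[i]) return i;
    }
    return n - 1;  // clamp readings of 700+ to the top level
}
```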

 


Step 4-2. Coding – Installation type (Beaver Dam)

 

#include <Adafruit_NeoPixel.h>
#include <Servo.h>

#define PIN 6
#define NUMLED 24
#define RSIG 4
#define DSIG 5

int dataA[110];
int dataB[110];
int dataC[110];
int dataD[110];
byte itemSize = sizeof(dataA[0]);
byte itemCount = sizeof(dataA) / itemSize;
int pos1 = 0;
int pos2 = 0;

// Create an object to drive the NeoPixel.
// The first argument is the number of LEDs on the NeoPixel strip.
// The second argument is the Arduino pin connected to the NeoPixel.
// The third argument is a flag that depends on the type of NeoPixel.
Adafruit_NeoPixel strip = Adafruit_NeoPixel(NUMLED, PIN, NEO_GRB + NEO_KHZ800);
Servo myservo1;
Servo myservo2;
uint32_t myColor[8] = {strip.Color(0, 0, 0), strip.Color(255, 0, 0), strip.Color(255, 140, 0), strip.Color(255, 255, 0),
                       strip.Color(0, 255, 0), strip.Color(0, 0, 255), strip.Color(75, 0, 130), strip.Color(128, 0, 128)};
int myData = 0;
int flag = 0;
int myData1;
int myData2;
int myData3;
int myData4;

void setup() {
    strip.begin();  // turn all LEDs off to initialize the NeoPixel
    strip.show();
    pinMode(RSIG, INPUT);
    pinMode(DSIG, INPUT);
    myservo1.attach(9);
    myservo2.attach(10);
    Serial.begin(115200);
    Serial.println("Edge Impulse Inferencing Demo");

    for (uint16_t i = 0; i < strip.numPixels(); i++) {
        strip.setPixelColor(i, myColor[0]);
        strip.show();
    }

    myservo1.write(0);
    myservo2.write(180);
}

void loop() {
    // Cycle the NeoPixel in the order below.

    memset(dataA, 0, sizeof(dataA));
    memset(dataB, 0, sizeof(dataB));
    memset(dataC, 0, sizeof(dataC));
    memset(dataD, 0, sizeof(dataD));
    int i = 0;

    while (digitalRead(RSIG) && i < itemCount) {  // bound i to avoid overrunning the 110-sample buffers
        dataA[i] = analogRead(A0);
        dataB[i] = analogRead(A1);
        dataC[i] = analogRead(A2);
        dataD[i++] = analogRead(A3);
        delay(10);
        flag = 1;
    }

    if (flag == 1) {
        // Sort each channel (largest first) and average the 3rd..7th
        // largest samples, trimming the top two as outliers.
        qsort(dataA, itemCount, itemSize, sort_asc);
        qsort(dataB, itemCount, itemSize, sort_asc);
        qsort(dataC, itemCount, itemSize, sort_asc);
        qsort(dataD, itemCount, itemSize, sort_asc);
        myData1 = (dataA[2] + dataA[3] + dataA[4] + dataA[5] + dataA[6]) / 5;
        myData2 = (dataB[2] + dataB[3] + dataB[4] + dataB[5] + dataB[6]) / 5;
        myData3 = (dataC[2] + dataC[3] + dataC[4] + dataC[5] + dataC[6]) / 5;
        myData4 = (dataD[2] + dataD[3] + dataD[4] + dataD[5] + dataD[6]) / 5;
        flag = 2;
    }

    //Serial.println(myData);
    if (digitalRead(DSIG) && flag == 2) {
        Serial.println("detect");
        Serial.print(myData1);
        Serial.print(", ");
        Serial.print(myData2);
        Serial.print(", ");
        Serial.print(myData3);
        Serial.print(", ");
        Serial.println(myData4);
        flag = 0;
        int Lv = 0;
        int a = myData1 > myData2 ? myData1 : myData2;
        int b = myData3 > myData4 ? myData3 : myData4;
        myData = a > b ? a : b;  // loudest of the four microphones

        int dir = 0;
        if (myData == myData1)
            dir = 3;
        else if (myData == myData2)
            dir = 4;
        else if (myData == myData3)
            dir = 1;
        else if (myData == myData4)
            dir = 2;

        if (myData < 100) {         // level 0
            Lv = 0;
        } else if (myData < 150) {  // level 1
            Lv = 1;
        } else if (myData < 200) {  // level 2
            Lv = 2;
        } else if (myData < 250) {  // level 3
            Lv = 3;
        } else if (myData < 300) {  // level 4
            Lv = 4;
        } else if (myData < 350) {  // level 5
            Lv = 5;
        } else if (myData < 400) {  // level 6
            Lv = 6;
        } else if (myData < 450) {  // level 7
            Lv = 7;
        }
        myData = 0;

        Serial.print("LV: ");
        Serial.print(Lv);
        Serial.print(" dir: ");
        Serial.print(dir);
        if (dir == 1) {
            myservo1.write(180);
            myservo2.write(140);
            for (uint16_t i = 0; i < 5; i++) {
                strip.setPixelColor(i, myColor[Lv]);
                strip.show();
            }
        } else if (dir == 2) {
            myservo1.write(105);
            myservo2.write(180);
            delay(10);
            for (uint16_t i = 7; i < 12; i++) {
                strip.setPixelColor(i, myColor[Lv]);
                strip.show();
            }
        } else if (dir == 3) {
            myservo1.write(0);
            myservo2.write(180);
            for (uint16_t i = 13; i < 18; i++) {
                strip.setPixelColor(i, myColor[Lv]);
                strip.show();
            }
        } else if (dir == 4) {
            myservo1.write(180);
            myservo2.write(60);
            for (uint16_t i = 19; i < 24; i++) {
                strip.setPixelColor(i, myColor[Lv]);
                strip.show();
            }
        }
        delay(1000);
        for (uint16_t i = 0; i < strip.numPixels(); i++) {
            strip.setPixelColor(i, strip.Color(0, 0, 0));
            strip.show();
        }
    }
}

int sort_asc(const void* item1, const void* item2)
{
    // Note: returns b - a, so despite the name this sorts in DESCENDING
    // order; the largest readings end up at the front of each array.
    int a = *((int*)item1);
    int b = *((int*)item2);
    return b - a;
}
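The trimming step above can be checked in isolation: sort_asc returns b - a, so qsort arranges each array largest-first, and indices 2..6 then pick the 3rd through 7th largest samples, discarding the two biggest as outliers. A standalone sketch of that logic (trimmed_peak is a hypothetical name introduced for illustration):

```cpp
#include <cstddef>
#include <cstdlib>

// Same comparator as sort_asc in the sketch above: returns b - a,
// i.e. a DESCENDING sort despite the name.
static int sort_desc_like(const void *item1, const void *item2) {
    int a = *(const int *)item1;
    int b = *(const int *)item2;
    return b - a;
}

// Average of elements 2..6 after the descending sort, mirroring
// myData1 = (dataA[2] + ... + dataA[6]) / 5 above. Requires count >= 7.
int trimmed_peak(int *data, size_t count) {
    qsort(data, count, sizeof(int), sort_desc_like);
    return (data[2] + data[3] + data[4] + data[5] + data[6]) / 5;
}
```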

 

Step 5-1. User test – Portable type (House Beaver)


Step 5-2. User test – Installation type (Beaver Dam)


Previous Research


Severe soil pollution in Shanxi Province, China >>link

Residents who drank contaminated groundwater for five years sued construction companies for damages. >>link

Law firm City and Human Environment Center >>link

Pipe Age and Leakage Rates Across the U.S. Call for Infrastructure Upgrades

Almost two trillion gallons of water per year are lost to leaks, about 15% of the total drinking water treated in the U.S.  >> link

Managing water for all

Leakage in well-run water utilities in OECD countries is usually in the range of
10%-30% of water production. In developing countries, it frequently exceeds 40%, and
sometimes reaches 70%. As a result, significantly more water needs to be produced and
transported than finally reaches the consumer. >>link

What About Pipe Leaks?

Pipe leaks, although less annoying or obvious, are much more serious and expensive than leaking faucets. On average, a pipe leak the size of the tip of a pencil will waste approximately 970 gallons in 24 hours at even low water pressure (this calculation is made using 40psi in water pressure; that water pressure level is low for homes in the Charlotte area). >>link


Keywords from the survey of 80 respondents and 10 stakeholders


  • Quarrel or dispute with neighbours
  • Financial damage
  • No insurance
  • Floor heating system
  • Apartment
  • Willing to buy the water leak detector
  • A price over £64 is considered reasonable
  • Should be made mandatory
  • Causes electrical problems
  • Contaminates soil and water


Expected Solution
