Proposal: Vision-Based Smart Lighting

 

PDE3823 – Project Proposal

 

Student Name: Abdi Ali
Student Number: M00814431
Topic of Study: Vision-based smart lighting
Assigned Supervisor: Mr. Huan Nguyen

 

Working Title
Development of a Smart Lighting System Using Room Occupancy, Natural Light, and Gesture-Based Control

 

Problem Definition
Modern energy-efficiency targets call for lighting that adapts intelligently to how spaces are actually used. Traditional lighting systems lack this adaptability, leading to wasted energy in unoccupied rooms and in areas with ample natural light. How can an intelligent lighting system, integrating computer vision, sensors, and gesture-based control, improve energy efficiency while enhancing user convenience?

 

Justification and Context
The demand for energy-efficient solutions has surged due to global energy concerns and environmental policies. Smart lighting systems address these challenges by reducing unnecessary energy consumption. Combining occupancy-based automation, gesture control, and natural-light adjustment provides a modern solution that aligns with sustainability goals.

 

Literature Review

  • Research shows that intelligent systems can reduce energy consumption by up to 40% in residential and commercial settings (Smith et al., 2023).
  • Existing solutions utilize basic motion detection and timers; however, these lack adaptability and user interaction capabilities.
  • Gesture-based systems, enabled by computer vision, offer an intuitive and interactive way to control devices (Doe, 2021).
  • Integration of light sensors and motion detectors can dynamically adapt lighting, complementing daylight (Jones & Patel, 2022).

 

Project Aims and Objectives

Primary Aim:
To design and develop a smart lighting system that adjusts lighting intensity based on room occupancy, natural light levels, and gesture-based user inputs.

Objectives:

  1. Implement gesture-based control using a camera module and computer vision algorithms.
  2. Integrate a PIR motion sensor for occupancy detection.
  3. Develop an adaptive lighting system using a light sensor for natural light adjustments.
  4. Build a system interface with Wi-Fi connectivity for remote monitoring and control.
  5. Test and validate the system under real-world conditions.

 

Research Methodology

The methodology for developing the smart lighting system follows a structured sequence of steps to design, implement, and evaluate the system. Each component of the methodology is explained below.

1. Approach

This project follows a hybrid methodology combining hardware prototyping and software development. The primary focus is on creating a functional, real-world system that integrates various technologies for smart lighting.

Steps:

  1. Conceptual Design:
    • Define system requirements: Identify the specific needs for room occupancy detection, natural light adjustment, and gesture control.
    • Outline component interactions: Develop a system flow diagram to represent how sensors, actuators, and the microcontroller interact.

 

  2. Prototyping:
    • Use an Arduino Mega or Nano for rapid development.
    • Begin with individual components (e.g., the PIR motion sensor, light sensor, and relay module) and test their functionality independently before integration (a minimal single-sensor test sketch is shown after this list).
  3. Integration:
    • Combine motion detection, ambient light sensing, and gesture-based control into a single system.
    • Ensure smooth communication between the components using an appropriate protocol (e.g., I2C or SPI).
  4. Testing and Refinement:
    • Conduct multiple iterations of testing to ensure the system performs reliably under various conditions.
    • Adjust thresholds, timings, and algorithms based on feedback and observed performance.
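
As a concrete illustration of the prototyping step, the sketch below shows one way to exercise the PIR sensor on its own before any integration work. It is a minimal sketch under assumed wiring (PIR output on digital pin 2); the pin and polling interval would be adjusted to match the actual build.

```cpp
// Minimal standalone PIR test (assumed wiring: PIR OUT -> digital pin 2).
// Prints a message on the serial monitor whenever the sensor changes state.
const int PIR_PIN = 2;
int lastState = LOW;

void setup() {
  pinMode(PIR_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  int state = digitalRead(PIR_PIN);
  if (state != lastState) {
    Serial.println(state == HIGH ? "Motion detected" : "No motion");
    lastState = state;
  }
  delay(100);  // simple polling interval
}
```

The same pattern (wire one component, print its raw output, confirm its behaviour) applies to the light sensor and relay module before they are combined.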

 

2. Data Collection and Analysis Techniques

a. Data Sources:

  • Motion Sensor: The PIR motion sensor detects human presence, generating digital signals (HIGH/LOW).
  • Light Sensor: Measures ambient light intensity in lux to determine whether artificial lighting is required.
  • Camera Module: Captures images or video frames used to detect gestures with computer vision algorithms.
  • User Feedback: User inputs are collected to evaluate the system's usability and accuracy.

b. Data Collection Techniques:

  • Log sensor outputs over time to observe patterns in occupancy and lighting conditions (a minimal logging sketch is shown after this list).
  • Use OpenCV libraries to process gestures and validate recognition accuracy against predefined gestures.
  • Record energy usage before and after deploying the system to quantify energy savings.
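
To make the logging technique concrete, the sketch below is a rough illustration, not the final implementation, of streaming occupancy and light readings as comma-separated values over the serial port so they can be captured to a file for later analysis. The pin assignments and the use of a raw analog light reading are assumptions; in the actual system the lux value would come from the TSL2591.

```cpp
// Illustrative data logger: prints "time_ms,occupancy,light_raw" once per second.
// Assumed wiring: PIR OUT -> D2; an analog light sensor on A0 stands in for the TSL2591.
const int PIR_PIN = 2;
const int LIGHT_PIN = A0;

void setup() {
  pinMode(PIR_PIN, INPUT);
  Serial.begin(9600);
  Serial.println("time_ms,occupancy,light_raw");  // CSV header
}

void loop() {
  unsigned long now = millis();
  int occupied = digitalRead(PIR_PIN);     // 1 = motion, 0 = no motion
  int lightRaw = analogRead(LIGHT_PIN);    // 0-1023 raw reading
  Serial.print(now);
  Serial.print(",");
  Serial.print(occupied);
  Serial.print(",");
  Serial.println(lightRaw);
  delay(1000);  // one sample per second
}
```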

c. Data Analysis Techniques:

  • Quantitative Analysis: Use numerical data to assess energy savings, system response times, and sensor accuracy.
  • Qualitative Analysis: Evaluate user feedback to understand the ease of use and effectiveness of gesture-based control.
  • Comparative Analysis: Compare the system’s performance with traditional lighting systems to highlight improvements.

 

3. Implementation Techniques

a. Motion Detection:

  • The PIR sensor continuously monitors for movement. If motion is detected, the relay activates the light.
  • Algorithms ensure that the light turns off after a certain period of inactivity.
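
The behaviour described above can be sketched roughly as follows. This is an assumed implementation: the pin numbers, the 60-second inactivity timeout, and the active-LOW relay polarity are placeholders to be tuned during testing.

```cpp
// Occupancy-based light control: the relay switches the light on when motion
// is detected and off after a period of inactivity.
// Assumptions: PIR OUT -> D2, relay IN -> D7 (active LOW), 60 s timeout.
const int PIR_PIN = 2;
const int RELAY_PIN = 7;
const unsigned long TIMEOUT_MS = 60000UL;

unsigned long lastMotionMs = 0;
bool lightOn = false;

void setup() {
  pinMode(PIR_PIN, INPUT);
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, HIGH);  // HIGH = relay off for an active-LOW module
}

void loop() {
  if (digitalRead(PIR_PIN) == HIGH) {
    lastMotionMs = millis();
    if (!lightOn) {
      digitalWrite(RELAY_PIN, LOW);   // switch the light on
      lightOn = true;
    }
  }
  if (lightOn && (millis() - lastMotionMs > TIMEOUT_MS)) {
    digitalWrite(RELAY_PIN, HIGH);    // switch the light off after inactivity
    lightOn = false;
  }
}
```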

b. Ambient Light Adjustment:

  • Use the TSL2591 sensor to read natural light intensity.
  • The system calculates the required artificial light intensity and adjusts accordingly.
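
A rough sketch of this adjustment loop is shown below. It assumes the Adafruit TSL2591 Arduino library over I2C and a PWM-dimmable LED driver on pin 9; the 300-lux target and the linear mapping from lux deficit to duty cycle are illustrative assumptions, and a relay-only build would instead apply a simple lux threshold.

```cpp
// Ambient-light-adaptive dimming (illustrative, not the final implementation).
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include "Adafruit_TSL2591.h"

Adafruit_TSL2591 tsl = Adafruit_TSL2591(2591);   // sensor ID is arbitrary
const int LED_PIN = 9;                            // assumed PWM-dimmable output
const float TARGET_LUX = 300.0;                   // assumed comfort level

void setup() {
  Serial.begin(9600);
  tsl.begin();                                    // joins the I2C bus
  tsl.setGain(TSL2591_GAIN_MED);
  tsl.setTiming(TSL2591_INTEGRATIONTIME_300MS);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  uint32_t lum = tsl.getFullLuminosity();         // IR in high word, full in low word
  uint16_t ir = lum >> 16;
  uint16_t full = lum & 0xFFFF;
  float lux = tsl.calculateLux(full, ir);

  // Fill the gap between natural light and the target with artificial light.
  float deficit = TARGET_LUX - lux;
  int duty = constrain((int)(deficit / TARGET_LUX * 255.0), 0, 255);
  analogWrite(LED_PIN, duty);

  Serial.print("lux=");
  Serial.print(lux);
  Serial.print(" duty=");
  Serial.println(duty);
  delay(500);
}
```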

c. Gesture-Based Control:

  • Train the system to recognize specific hand gestures (e.g., waving to turn lights on/off, pinching to dim).
  • Use machine learning models or rule-based approaches to interpret gestures.
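
As an illustration of the rule-based end of that spectrum, the OpenCV snippet below (C++, running on the Raspberry Pi rather than the Arduino) toggles a light state when sustained motion, such as a hand wave in front of the camera, is detected by frame differencing. The 5% motion threshold and 10-frame persistence are assumed values chosen for illustration; a trained gesture model would replace this heuristic, and the result would be forwarded to the lighting controller rather than printed.

```cpp
// Illustrative rule-based "wave" detector using frame differencing (OpenCV, C++).
// Sustained motion across several consecutive frames toggles the light state.
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    cv::VideoCapture cap(0);                       // default camera
    if (!cap.isOpened()) return 1;

    cv::Mat frame, gray, prevGray, diff, mask;
    int activeFrames = 0;
    bool lightsOn = false;

    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::GaussianBlur(gray, gray, cv::Size(9, 9), 0);

        if (!prevGray.empty()) {
            cv::absdiff(prevGray, gray, diff);             // pixel-wise change
            cv::threshold(diff, mask, 25, 255, cv::THRESH_BINARY);
            double motionRatio = (double)cv::countNonZero(mask) / mask.total();

            // Assumed heuristic: >5% of pixels changing for 10 frames = a wave.
            activeFrames = (motionRatio > 0.05) ? activeFrames + 1 : 0;
            if (activeFrames == 10) {
                lightsOn = !lightsOn;
                std::cout << (lightsOn ? "Gesture: lights ON" : "Gesture: lights OFF")
                          << std::endl;
                activeFrames = 0;                          // debounce
            }
        }
        prevGray = gray.clone();
        cv::imshow("camera", frame);
        if (cv::waitKey(1) == 27) break;                   // Esc to exit
    }
    return 0;
}
```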

d. Remote Control:

  • Deploy a web-based or mobile-friendly interface using the ESP8266 Wi-Fi module.
  • Implement an MQTT-based communication protocol to enable remote operation.
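
A minimal sketch of this remote-control path is shown below, assuming the widely used PubSubClient MQTT library on the ESP8266; the Wi-Fi credentials, broker address, topic name, and relay pin are placeholders rather than values fixed by the design.

```cpp
// Illustrative ESP8266 MQTT remote control (assumes the PubSubClient library).
// Subscribes to a command topic and drives the relay pin accordingly.
#include <ESP8266WiFi.h>
#include <PubSubClient.h>

const char* WIFI_SSID = "your-ssid";          // placeholder credentials
const char* WIFI_PASS = "your-password";
const char* MQTT_BROKER = "192.168.1.10";     // placeholder broker address
const int   RELAY_PIN = D1;                   // assumed relay pin on a D1 Mini

WiFiClient espClient;
PubSubClient mqtt(espClient);

// Handle incoming messages on the command topic ("ON"/"OFF" payloads assumed).
void onMessage(char* topic, byte* payload, unsigned int length) {
  String cmd;
  for (unsigned int i = 0; i < length; i++) cmd += (char)payload[i];
  digitalWrite(RELAY_PIN, cmd == "ON" ? LOW : HIGH);   // active-LOW relay assumed
}

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, HIGH);               // start with the light off
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
  mqtt.setServer(MQTT_BROKER, 1883);
  mqtt.setCallback(onMessage);
}

void loop() {
  if (!mqtt.connected()) {
    if (mqtt.connect("smart-light-node")) {
      mqtt.subscribe("home/light/cmd");        // placeholder topic
    } else {
      delay(1000);
      return;
    }
  }
  mqtt.loop();                                 // process incoming MQTT traffic
}
```

A companion web or mobile dashboard would publish "ON"/"OFF" (and, later, dimming values) to the same topic.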

 

4. Ethical Considerations

  • Privacy Protection: Process gesture data locally to ensure user privacy and avoid transmitting sensitive visual data over the network.
  • Energy Efficiency: Align with environmental goals by reducing unnecessary energy usage.
  • Compliance: Ensure the system adheres to regulations like GDPR for any data transmission.

 

5. Evaluation and Testing

a. Test Scenarios:

  • Room Occupancy: Test the PIR sensor in different room sizes and occupancy scenarios (e.g., single user, multiple users).
  • Light Conditions: Evaluate the light sensor under various natural lighting conditions, from daylight to complete darkness.
  • Gestures: Test gesture recognition accuracy with different users, lighting conditions, and distances from the camera.

b. Metrics:

  • Accuracy: Measure the success rate of gesture detection and sensor responses.
  • Energy Savings: Calculate the percentage of energy saved by comparing the system's energy usage to traditional systems (see the formula below).
  • System Latency: Measure the time taken for the system to respond to sensor inputs and gestures.
  • User Satisfaction: Conduct surveys to gauge user experience and convenience.
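
For the energy-savings metric, a straightforward way to express the comparison is:

Energy savings (%) = (E_traditional - E_smart) / E_traditional × 100

where E_traditional is the energy consumed over a test period with conventional, always-on lighting and E_smart is the energy consumed by the proposed system over an equivalent period. As a purely illustrative example, if a baseline room used 2.0 kWh per day and the smart system used 1.3 kWh, the saving would be (2.0 - 1.3) / 2.0 × 100 = 35%.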

 

Approach:

  • Design and prototype using Arduino Nano/Mega.
  • Employ a camera module with computer vision algorithms for gesture detection.
  • Use light sensors and PIR motion sensors to gather real-time data.
  • Implement relay modules for light control.
  • Develop a mobile/web interface for remote control using a Wi-Fi module.

Data Collection & Analysis Techniques:

  • Collect sensor data for occupancy and light levels.
  • Test gesture detection accuracy using predefined commands.
  • Analyze system response times and energy savings.

Ethical Considerations:

  • Ensure user privacy by processing gesture data locally on the device.
  • Adhere to data protection regulations if any data is transmitted.

 

Project Scope and Feasibility

 

Project Scope

The project focuses on designing, implementing, and testing a smart lighting system that adjusts lighting intensity based on room occupancy, natural light levels, and user gestures. The following points outline what the project includes:

Inclusions

  1. Component Integration:
    • PIR motion sensor for occupancy detection.
    • Light sensor for measuring ambient light.
    • Camera module for gesture recognition.
    • Relay module for controlling lights.
    • Wi-Fi module for enabling remote control.
  2. Functionality:
    • Automatic adjustment of lighting intensity based on real-time conditions.
    • Manual control through gesture recognition (e.g., turning lights ON/OFF, dimming).
    • Remote operation via a smartphone or web interface.
    • Logging of sensor data for system analysis.
  3. Testing and Evaluation:
    • System testing in a controlled room environment (e.g., 25 m² space).
    • Validation under various lighting and occupancy scenarios.
    • Evaluation of user satisfaction and energy efficiency.
  4. Deliverables:
    • A functional prototype of the smart lighting system.
    • Documentation, including design diagrams, source code, and testing results.
    • Performance analysis report highlighting energy savings.

 

 

Project Feasibility

1. Technical Feasibility

The project is technically feasible due to the availability of mature technologies and tools. Components like PIR sensors, light sensors, and Arduino boards are well documented and have numerous open-source resources to support development. Gesture recognition can be achieved with OpenCV running on a more capable companion board such as a Raspberry Pi, which passes the recognised commands to the Arduino.

Challenges and Mitigation:

  • Gesture Recognition Accuracy: Use a robust library like OpenCV and test extensively in different conditions.
  • Hardware Limitations: Opt for the Arduino Mega if more processing power and I/O pins are needed, and offload vision processing to the Raspberry Pi.

2. Financial Feasibility

The project is cost-effective, with all components readily available and affordable. Below is an estimated budget, based on the components listed under Required Resources:

  • Arduino Mega board: £24
  • Camera module: £23
  • Light sensor (TSL2591): £7
  • PIR motion sensor: £8
  • Relay module: £5
  • Wi-Fi module (ESP8266): £9
  • Total: approximately £76 (excluding the Raspberry Pi 5)

Given the project's limited scope, this budget is manageable for an individual or a small team.

3. Resource Feasibility

All required hardware components, software tools, and expertise are accessible:

  • Hardware: Widely available from online and local electronics stores.
  • Software: Arduino IDE, OpenCV, and MQTT libraries are free to use.
  • Expertise: Basic proficiency in Arduino programming, sensor integration, and computer vision will suffice. Guidance from a supervisor ensures the project remains on track.

4. Environmental and Social Feasibility

The project aligns with sustainability goals by promoting energy efficiency:

  • Reduces energy consumption through automated lighting adjustments.
  • Provides an accessible lighting control system, especially beneficial for people with physical disabilities.

 

Required Resources

Equipment and Materials:

  • Arduino Mega: £24
    ELEGOO MEGA R3 Board ATmega 2560 with USB cable, compatible with the Arduino IDE (Amazon.co.uk)

  • Camera module: £23
    Raspberry Pi Camera Module 3 (The Pi Hut)

  • Light sensor: £7
    Adafruit TSL2591 High Dynamic Range Digital Light Sensor, STEMMA QT (The Pi Hut)

  • PIR motion sensor: £8
    Electrones HC-SR501 PIR Motion Sensor Module, 5-pack, compatible with Arduino and Raspberry Pi (Amazon.co.uk)

  • Relay module: £5
    Yizhet 5V 2-Channel Relay Module with optocoupler, for Raspberry Pi and Arduino (Amazon.co.uk)

  • Wi-Fi module (ESP8266): £9
    AZDelivery D1 Mini NodeMCU ESP8266-12F board, 5-pack (Amazon.co.uk)

  • Raspberry Pi 5 (4GB)
    https://www.amazon.co.uk/Raspberry-Pi-SC1111-5-4GB/dp/B0CK3L9WD3/ref=asc_df_B0CK3L9WD3?tag=bingshoppinga-21&linkCode=df0&hvadid=79852222280836&hvnetw=o&hvqmt=e&hvbmt=be&hvdev=c&hvlocint=&hvlocphy=&hvtargid=pla-4583451686446758&psc=1

Software and Technical Needs:

  • Arduino IDE (Free)
  • OpenCV library (Free)
  • MATLAB/Simulink (University-provided license)

Specialist Support:

  • Supervisor guidance for system design and testing.
  • Lab technician support for hardware assembly.

 

Intended Deliverables

Tangible Outputs:

  • A fully functional smart lighting system prototype.
  • Technical documentation, including design diagrams and source code.
  • Performance analysis report detailing energy savings and system responsiveness.

Expected Impact:

  • Significant reduction in energy wastage.
  • Enhanced user convenience through gesture-based control.
  • Scalability for wider applications in smart homes and offices.

 

Initial Bibliography/Reference List

  • Smith, J. (2023). Intelligent Systems for Energy Efficiency. Green Energy Solutions.
  • Doe, A. (2021). Computer Vision in IoT. Springer.
  • Jones, L., & Patel, R. (2022). Adaptive Lighting Solutions. IEEE Transactions.

 

Risk Assessment

 

Potential challenges and corresponding mitigation strategies:

  • Limited accuracy in gesture recognition: Implement robust training algorithms and test extensively.
  • Hardware component failure: Maintain spare components and perform routine testing.
  • Wi-Fi connectivity issues: Design fallback mechanisms for offline operation.
  • Privacy concerns regarding camera usage: Process all data locally without cloud storage.

 

Supervisor Approval

 

Is the Proposal Acceptable?

Signed (digitally): .............................................................. Date: ....................

Comments
