<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	
	>
<channel>
	<title>
	Comments for chillibasket	</title>
	<atom:link href="https://wired.chillibasket.com/comments/feed/" rel="self" type="application/rss+xml" />
	<link>https://wired.chillibasket.com</link>
	<description>A Robotics and Technology Blog</description>
	<lastBuildDate>Wed, 04 Feb 2026 08:17:29 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	
	<item>
		<title>
		Comment on 3D Printed WALL·E by RCJOSEB		</title>
		<link>https://wired.chillibasket.com/3d-printed-wall-e/#comment-1471</link>

		<dc:creator><![CDATA[RCJOSEB]]></dc:creator>
		<pubDate>Wed, 04 Feb 2026 08:17:29 +0000</pubDate>
		<guid isPermaLink="false">https://wired.chillibasket.com/?page_id=917#comment-1471</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://wired.chillibasket.com/3d-printed-wall-e/#comment-1460&quot;&gt;Odre&lt;/a&gt;.

I had the same issue but using an ESP32 with an I2C servo controller.  The I2C servo controller has two power inputs, one to power the board itself and the other to power the servos.  To power the board itself, I am using a 5V buck which is powered by the main battery.  To power the servos, I am using a Henge 8A switching UBEC which is also powered by the main battery.  All of the ground wires are connected to the main battery ground using a terminal block.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://wired.chillibasket.com/3d-printed-wall-e/#comment-1460">Odre</a>.</p>
<p>I had the same issue but using an ESP32 with an I2C servo controller.  The I2C servo controller has two power inputs, one to power the board itself and the other to power the servos.  To power the board itself, I am using a 5V buck which is powered by the main battery.  To power the servos, I am using a Henge 8A switching UBEC which is also powered by the main battery.  All of the ground wires are connected to the main battery ground using a terminal block.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		Comment on 3D printed Wall-E by Daniel		</title>
		<link>https://wired.chillibasket.com/2018/09/wall-e/#comment-1467</link>

		<dc:creator><![CDATA[Daniel]]></dc:creator>
		<pubDate>Mon, 26 Jan 2026 00:47:48 +0000</pubDate>
		<guid isPermaLink="false">http://wired.chillibasket.com/?p=735#comment-1467</guid>

					<description><![CDATA[Wow! How did you design this tread? It broke easily at first, but when I reduced the number of plates it fit perfectly.]]></description>
			<content:encoded><![CDATA[<p>Wow! How did you design this tread? It broke easily at first, but when I reduced the number of plates it fit perfectly.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		Comment on 3D Printed WALL·E by John Doyle		</title>
		<link>https://wired.chillibasket.com/3d-printed-wall-e/#comment-1462</link>

		<dc:creator><![CDATA[John Doyle]]></dc:creator>
		<pubDate>Wed, 07 Jan 2026 18:17:49 +0000</pubDate>
		<guid isPermaLink="false">https://wired.chillibasket.com/?page_id=917#comment-1462</guid>

					<description><![CDATA[I have completed printing and fully assembling Wall-E. The electronics are assembled and working, but I am not happy with the servos I installed. Please tell me the servos you are using and hopefully I can find some with similar specs. Thanks for your help and a fun project. My grandsons are a great help.]]></description>
			<content:encoded><![CDATA[<p>I have completed printing and fully assembling Wall-E. The electronics are assembled and working, but I am not happy with the servos I installed. Please tell me the servos you are using and hopefully I can find some with similar specs. Thanks for your help and a fun project. My grandsons are a great help.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		Comment on 3D Printed WALL·E by Odre		</title>
		<link>https://wired.chillibasket.com/3d-printed-wall-e/#comment-1460</link>

		<dc:creator><![CDATA[Odre]]></dc:creator>
		<pubDate>Sun, 28 Dec 2025 18:38:45 +0000</pubDate>
		<guid isPermaLink="false">https://wired.chillibasket.com/?page_id=917#comment-1460</guid>

					<description><![CDATA[Trying my luck here again. I have my Wall-E assembled and running, but the servos twitch/shake when I move the motors slowly. The motors alone work fine, and so do the servos, but when I try to move the motors after using the servos, the servos start shaking uncontrollably. Pushing the motors to full speed stops it, but there&#039;s still something wrong and I can&#039;t use both the servos and the motors at the same time.
From what I&#039;ve investigated, the likely cause is electrical noise from the motors. To address that, I&#039;ve replaced the buck converter with a UBEC and added ceramic and electrolytic capacitors to the servo shield.
I&#039;ve also read that the GND wire from the servo shield to the Arduino has to be connected directly (not through the motor shield), which was not clear from the wiring diagram. I also checked that all my servos are 180°.
With all these changes I still have the issue.
Any ideas on why it&#039;s happening? I&#039;m new to electronics, so maybe there&#039;s some obvious concept that I&#039;m missing?]]></description>
			<content:encoded><![CDATA[<p>Trying my luck here again. I have my Wall-E assembled and running, but the servos twitch/shake when I move the motors slowly. The motors alone work fine, and so do the servos, but when I try to move the motors after using the servos, the servos start shaking uncontrollably. Pushing the motors to full speed stops it, but there&#8217;s still something wrong and I can&#8217;t use both the servos and the motors at the same time.<br />
From what I&#8217;ve investigated, the likely cause is electrical noise from the motors. To address that, I&#8217;ve replaced the buck converter with a UBEC and added ceramic and electrolytic capacitors to the servo shield.<br />
I&#8217;ve also read that the GND wire from the servo shield to the Arduino has to be connected directly (not through the motor shield), which was not clear from the wiring diagram. I also checked that all my servos are 180°.<br />
With all these changes I still have the issue.<br />
Any ideas on why it&#8217;s happening? I&#8217;m new to electronics, so maybe there&#8217;s some obvious concept that I&#8217;m missing?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		Comment on 3D Printed WALL·E by Jared		</title>
		<link>https://wired.chillibasket.com/3d-printed-wall-e/#comment-1459</link>

		<dc:creator><![CDATA[Jared]]></dc:creator>
		<pubDate>Wed, 24 Dec 2025 20:27:17 +0000</pubDate>
		<guid isPermaLink="false">https://wired.chillibasket.com/?page_id=917#comment-1459</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://wired.chillibasket.com/3d-printed-wall-e/#comment-1453&quot;&gt;redclip&lt;/a&gt;.

I followed it step by step, but it does not appear to work. The web service shuts down, and an invalid syntax error occurs when running python3 ~/walle-replica/web_interface/app.py to check for errors.

&lt;span&gt;&#160;File &lt;/span&gt;&quot;/home/walle-replica/web_interface/app.py&quot;&lt;span&gt;, line &lt;/span&gt;27
&#160;&#160;from flask import Flask, &lt;strong&gt;...&lt;/strong&gt;, Response&#160;# Add Response
&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&lt;strong&gt;^^^&lt;/strong&gt;
&lt;strong&gt;SyntaxError&lt;/strong&gt;&lt;span&gt;: &lt;/span&gt;invalid syntax]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://wired.chillibasket.com/3d-printed-wall-e/#comment-1453">redclip</a>.</p>
<p>I followed it step by step, but it does not appear to work. The web service shuts down, and an invalid syntax error occurs when running python3 ~/walle-replica/web_interface/app.py to check for errors.</p>
<p><span>&nbsp;File </span>&#8220;/home/walle-replica/web_interface/app.py&#8221;<span>, line </span>27<br />
&nbsp;&nbsp;from flask import Flask, <strong>&#8230;</strong>, Response&nbsp;# Add Response<br />
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;<strong>^^^</strong><br />
<strong>SyntaxError</strong><span>: </span>invalid syntax</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		Comment on 3D Printed WALL·E by Jared		</title>
		<link>https://wired.chillibasket.com/3d-printed-wall-e/#comment-1458</link>

		<dc:creator><![CDATA[Jared]]></dc:creator>
		<pubDate>Wed, 24 Dec 2025 18:14:35 +0000</pubDate>
		<guid isPermaLink="false">https://wired.chillibasket.com/?page_id=917#comment-1458</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://wired.chillibasket.com/3d-printed-wall-e/#comment-1453&quot;&gt;redclip&lt;/a&gt;.

I have tried these changes and they did not work. They seemed to stop the web interface service, and it now gives an internal server error message.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://wired.chillibasket.com/3d-printed-wall-e/#comment-1453">redclip</a>.</p>
<p>I have tried these changes and they did not work. They seemed to stop the web interface service, and it now gives an internal server error message.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		Comment on 3D Printed WALL·E by redclip		</title>
		<link>https://wired.chillibasket.com/3d-printed-wall-e/#comment-1454</link>

		<dc:creator><![CDATA[redclip]]></dc:creator>
		<pubDate>Sat, 29 Nov 2025 11:00:02 +0000</pubDate>
		<guid isPermaLink="false">https://wired.chillibasket.com/?page_id=917#comment-1454</guid>

					<description><![CDATA[Let&#039;s make WALL-E smarter.

Guide: Implementing Face Tracking for WALL-E
This guide outlines how to enable face tracking on your WALL-E robot using OpenCV. This feature allows WALL-E to detect a human face and automatically rotate its head to follow the movement.
1. Prerequisites
Ensure that the OpenCV data files (Haar cascades) are installed on your Raspberry Pi.
Bash

&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;sudo apt install opencv-data -y
&lt;/pre&gt;2. Locate the Data File
Find the absolute path of the &lt;code&gt;haarcascade_frontalface_default.xml&lt;/code&gt; file, as the system might not locate it automatically.
Bash

&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;find /usr -name &quot;haarcascade_frontalface_default.xml&quot;
# Common path: /usr/share/opencv4/haarcascades/haarcascade_frontalface_default.xml
&lt;/pre&gt;3. Modify the Backend (&lt;code&gt;app.py&lt;/code&gt;)
Update the &lt;code&gt;USBCameraStreamer&lt;/code&gt; class in &lt;code&gt;web_interface/app.py&lt;/code&gt; to include the face detection and tracking logic.
A. Initialization (&lt;code&gt;__init__&lt;/code&gt;)
Load the Haar cascade model using the absolute path found in Step 2 and initialize the head position variable.
Python

&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;def __init__(self):
    self.camera = cv2.VideoCapture(0)
    # Load the face detection model using the absolute path
    self.face_cascade = cv2.CascadeClassifier(&#039;/usr/share/opencv4/haarcascades/haarcascade_frontalface_default.xml&#039;)
    self.is_running = False
    self.head_pos = 50  # Initial head position (Center)
&lt;/pre&gt;B. Tracking Logic (&lt;code&gt;get_frame&lt;/code&gt;)
Implement the logic to detect faces and send servo commands to the Arduino.
Python

&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;def get_frame(self):
    if self.is_running:
        success, frame = self.camera.read()
        if success:
            # 1. Convert to grayscale for performance
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            
            # 2. Detect faces
            faces = self.face_cascade.detectMultiScale(gray, 1.1, 4)

            for (x, y, w, h) in faces:
                # Draw a rectangle around the face
                cv2.rectangle(frame, (x, y), (x+w, y+h), (255, 0, 0), 2)
                
                # 3. Calculate the center of the face
                x_center = x + (w / 2)
                
                # 4. Control the head servo (frame centre x=320, deadzone ±40 px)
                global arduino
                
                # If face is on the left -&#062; Rotate head left (Decrease angle)
                if x_center &#060; 280: 
                    if self.head_pos &#062; 0:
                        self.head_pos -= 1
                        arduino.send_command(&quot;G&quot; + str(self.head_pos))
                        
                # If face is on the right -&#062; Rotate head right (Increase angle)
                elif x_center &#062; 360: 
                    if self.head_pos &#060; 100:
                        self.head_pos += 1
                        arduino.send_command(&quot;G&quot; + str(self.head_pos))

            ret, buffer = cv2.imencode(&#039;.jpg&#039;, frame)
            return buffer.tobytes()
    return None
&lt;/pre&gt;4. Run &#038; Test
Restart the WALL-E service to apply the changes.
Bash

&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;sudo systemctl restart walle.service
sudo systemctl status walle.service
&lt;/pre&gt;&lt;ul&gt;&lt;li&gt;&lt;strong&gt;Note:&lt;/strong&gt; Verify that the service status is &lt;code&gt;active (running)&lt;/code&gt;.&lt;/li&gt;&lt;/ul&gt;]]></description>
			<content:encoded><![CDATA[<p>Let&#8217;s make WALL-E smarter.</p>
<p>Guide: Implementing Face Tracking for WALL-E<br />
This guide outlines how to enable face tracking on your WALL-E robot using OpenCV. This feature allows WALL-E to detect a human face and automatically rotate its head to follow the movement.<br />
1. Prerequisites<br />
Ensure that the OpenCV data files (Haar cascades) are installed on your Raspberry Pi.<br />
Bash</p>
<pre class="ql-syntax" spellcheck="false">sudo apt install opencv-data -y
</pre>
<p>2. Locate the Data File<br />
Find the absolute path of the <code>haarcascade_frontalface_default.xml</code> file, as the system might not locate it automatically.<br />
Bash</p>
<pre class="ql-syntax" spellcheck="false">find /usr -name "haarcascade_frontalface_default.xml"
# Common path: /usr/share/opencv4/haarcascades/haarcascade_frontalface_default.xml
</pre>
<p>3. Modify the Backend (<code>app.py</code>)<br />
Update the <code>USBCameraStreamer</code> class in <code>web_interface/app.py</code> to include the face detection and tracking logic.<br />
A. Initialization (<code>__init__</code>)<br />
Load the Haar cascade model using the absolute path found in Step 2 and initialize the head position variable.<br />
Python</p>
<pre class="ql-syntax" spellcheck="false">def __init__(self):
    self.camera = cv2.VideoCapture(0)
    # Load the face detection model using the absolute path
    self.face_cascade = cv2.CascadeClassifier('/usr/share/opencv4/haarcascades/haarcascade_frontalface_default.xml')
    self.is_running = False
    self.head_pos = 50  # Initial head position (Center)
</pre>
<p>B. Tracking Logic (<code>get_frame</code>)<br />
Implement the logic to detect faces and send servo commands to the Arduino.<br />
Python</p>
<pre class="ql-syntax" spellcheck="false">def get_frame(self):
    if self.is_running:
        success, frame = self.camera.read()
        if success:
            # 1. Convert to grayscale for performance
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            
            # 2. Detect faces
            faces = self.face_cascade.detectMultiScale(gray, 1.1, 4)

            for (x, y, w, h) in faces:
                # Draw a rectangle around the face
                cv2.rectangle(frame, (x, y), (x+w, y+h), (255, 0, 0), 2)
                
                # 3. Calculate the center of the face
                x_center = x + (w / 2)
                
                # 4. Control the head servo (frame centre x=320, deadzone ±40 px)
                global arduino
                
                # If face is on the left -&gt; Rotate head left (Decrease angle)
                if x_center &lt; 280: 
                    if self.head_pos &gt; 0:
                        self.head_pos -= 1
                        arduino.send_command("G" + str(self.head_pos))
                        
                # If face is on the right -&gt; Rotate head right (Increase angle)
                elif x_center &gt; 360: 
                    if self.head_pos &lt; 100:
                        self.head_pos += 1
                        arduino.send_command("G" + str(self.head_pos))

            ret, buffer = cv2.imencode('.jpg', frame)
            return buffer.tobytes()
    return None
</pre>
<p>4. Run &amp; Test<br />
Restart the WALL-E service to apply the changes.<br />
Bash</p>
<pre class="ql-syntax" spellcheck="false">sudo systemctl restart walle.service
sudo systemctl status walle.service
</pre>
<ul>
<li><strong>Note:</strong> Verify that the service status is <code>active (running)</code>.</li>
</ul>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		Comment on 3D Printed WALL·E by redclip		</title>
		<link>https://wired.chillibasket.com/3d-printed-wall-e/#comment-1453</link>

		<dc:creator><![CDATA[redclip]]></dc:creator>
		<pubDate>Sat, 29 Nov 2025 10:56:02 +0000</pubDate>
		<guid isPermaLink="false">https://wired.chillibasket.com/?page_id=917#comment-1453</guid>

					<description><![CDATA[Let&#039;s make WALL-E work with a USB camera! Just a few simple changes.

Guide: Enabling USB Camera Support for WALL-E
This guide outlines the steps to replace the default PiCamera module with a USB Webcam using OpenCV in the WALL-E Flask web interface. ( by Gemini )
&#160;
1. Prerequisites&#160;
First, install the OpenCV library for Python on your Raspberry Pi.
Bash


&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;sudo apt update
sudo apt install python3-opencv
&lt;/pre&gt;&#160;
2. Modify Backend (&lt;code&gt;app.py&lt;/code&gt;)&#160;
We need to update the Python Flask server to capture frames from the USB camera.
&lt;strong&gt;A. Import OpenCV and Response&lt;/strong&gt;&#160;Add&#160;&lt;code&gt;cv2&lt;/code&gt;&#160;and&#160;&lt;code&gt;Response&lt;/code&gt;&#160;to the imports at the top of the file.
Python


&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;from flask import Flask, ..., Response  # Add Response
import cv2                              # Add cv2
&lt;/pre&gt;&lt;strong&gt;B. Add Camera Class&lt;/strong&gt;&#160;Insert the&#160;&lt;code&gt;USBCameraStreamer&lt;/code&gt;&#160;class before the&#160;&lt;code&gt;app = Flask(__name__)&lt;/code&gt;&#160;line.
Python


&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;class USBCameraStreamer:
    def __init__(self):
        self.camera = cv2.VideoCapture(0) # 0 is usually the default USB camera
        self.is_running = False

    def start_stream(self):
        if not self.camera.isOpened():
            self.camera.open(0)
        self.is_running = True
        return True, &quot;USB Camera Started&quot;

    def stop_stream(self):
        self.is_running = False
        if self.camera.isOpened():
            self.camera.release()
        return True

    def is_stream_active(self):
        return self.is_running

    def get_frame(self):
        if self.is_running:
            success, frame = self.camera.read()
            if success:
                ret, buffer = cv2.imencode(&#039;.jpg&#039;, frame)
                return buffer.tobytes()
        return None
&lt;/pre&gt;&lt;strong&gt;C. Initialize Camera&lt;/strong&gt;&#160;Replace the existing&#160;&lt;code&gt;PiCameraStreamer&lt;/code&gt;&#160;initialization with our new class (around line 34).
Python


&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;# camera: PiCameraStreamer = PiCameraStreamer()  &#060;-- Comment this out
camera = USBCameraStreamer()                     # &#060;-- Add this
&lt;/pre&gt;&lt;strong&gt;D. Add Video Route&lt;/strong&gt;&#160;Add the generator function and route at the bottom of the file (before&#160;&lt;code&gt;if __name__ == ...&lt;/code&gt;).
Python


&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;def gen(camera):
    while True:
        frame = camera.get_frame()
        if frame is not None:
            yield (b&#039;--frame\r\n&#039;
                   b&#039;Content-Type: image/jpeg\r\n\r\n&#039; + frame + b&#039;\r\n&#039;)
        else:
             time.sleep(0.1)

@app.route(&#039;/video_feed&#039;)
def video_feed():
    return Response(gen(camera),
                    mimetype=&#039;multipart/x-mixed-replace; boundary=frame&#039;)
&lt;/pre&gt;&#160;
3. Modify Frontend (&lt;code&gt;index.html&lt;/code&gt;)&#160;
Update the HTML to load the video from our new route.
&lt;ul&gt;&lt;li&gt;&lt;strong&gt;File:&lt;/strong&gt;&#160;&lt;code&gt;web_interface/templates/index.html&lt;/code&gt;&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Action:&lt;/strong&gt;&#160;Find the&#160;&lt;code&gt;&#060;img&#062;&lt;/code&gt;&#160;tag with&#160;&lt;code&gt;id=&quot;stream&quot;&lt;/code&gt;&#160;and change the&#160;&lt;code&gt;src&lt;/code&gt;&#160;attribute.&lt;/li&gt;&lt;/ul&gt;HTML


&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;&#060;img id=&quot;stream&quot; src=&quot;{{ url_for(&#039;static&#039;, filename=&#039;streamimage.jpg&#039;) }}&quot;&#062;

&#060;img id=&quot;stream&quot; src=&quot;{{ url_for(&#039;video_feed&#039;) }}&quot;&#062;
&lt;/pre&gt;&#160;
4. Update Logic (&lt;code&gt;main.js&lt;/code&gt;)&#160;
Prevent the JavaScript from overwriting the camera URL with the old port 8080 address.
&lt;ul&gt;&lt;li&gt;&lt;strong&gt;File:&lt;/strong&gt;&#160;&lt;code&gt;web_interface/static/js/main.js&lt;/code&gt;&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Action:&lt;/strong&gt;&#160;Search for&#160;&lt;code&gt;$(&quot;#stream&quot;).attr(&quot;src&quot;, ...)&lt;/code&gt;&#160;and replace the URL in&#160;&lt;strong&gt;two places&lt;/strong&gt;&#160;(inside&#160;&lt;code&gt;sendSettings&lt;/code&gt;&#160;function and&#160;&lt;code&gt;window.onload&lt;/code&gt;&#160;section).&lt;/li&gt;&lt;/ul&gt;JavaScript


&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;// Replace this logic in both locations:
$(&quot;#stream&quot;).attr(&quot;src&quot;, &quot;/video_feed&quot;);
&lt;/pre&gt;&#160;
5. Run&#160;
Restart the service to apply changes.
Bash


&lt;pre class=&quot;ql-syntax&quot; spellcheck=&quot;false&quot;&gt;sudo systemctl stop walle.service
python3 app.py
&lt;/pre&gt;]]></description>
			<content:encoded><![CDATA[<p>Let&#8217;s make WALL-E work with a USB camera! Just a few simple changes.</p>
<p>Guide: Enabling USB Camera Support for WALL-E<br />
This guide outlines the steps to replace the default PiCamera module with a USB Webcam using OpenCV in the WALL-E Flask web interface. ( by Gemini )<br />
&nbsp;<br />
1. Prerequisites&nbsp;<br />
First, install the OpenCV library for Python on your Raspberry Pi.<br />
Bash</p>
<pre class="ql-syntax" spellcheck="false">sudo apt update
sudo apt install python3-opencv
</pre>
<p>&nbsp;<br />
2. Modify Backend (<code>app.py</code>)&nbsp;<br />
We need to update the Python Flask server to capture frames from the USB camera.<br />
<strong>A. Import OpenCV and Response</strong>&nbsp;Add&nbsp;<code>cv2</code>&nbsp;and&nbsp;<code>Response</code>&nbsp;to the imports at the top of the file.<br />
Python</p>
<pre class="ql-syntax" spellcheck="false">from flask import Flask, ..., Response  # Add Response
import cv2                              # Add cv2
</pre>
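Note that the <code>...</code> in the import line above is a placeholder for the imports already present in <code>app.py</code>; pasted literally, it is invalid Python. A quick stdlib check illustrating this (the <code>render_template</code> and <code>request</code> names are illustrative substitutes, not necessarily the file's actual imports):

```python
import ast

def is_valid_python(src):
    # Returns True if the snippet parses as Python source code.
    try:
        ast.parse(src)
        return True
    except SyntaxError:
        return False

# "..." is not a valid name inside an import list:
print(is_valid_python("from flask import Flask, ..., Response"))  # False
# Replacing it with real import names parses fine:
print(is_valid_python("from flask import Flask, render_template, request, Response"))  # True
```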
<p><strong>B. Add Camera Class</strong>&nbsp;Insert the&nbsp;<code>USBCameraStreamer</code>&nbsp;class before the&nbsp;<code>app = Flask(__name__)</code>&nbsp;line.<br />
Python</p>
<pre class="ql-syntax" spellcheck="false">class USBCameraStreamer:
    def __init__(self):
        self.camera = cv2.VideoCapture(0) # 0 is usually the default USB camera
        self.is_running = False

    def start_stream(self):
        if not self.camera.isOpened():
            self.camera.open(0)
        self.is_running = True
        return True, "USB Camera Started"

    def stop_stream(self):
        self.is_running = False
        if self.camera.isOpened():
            self.camera.release()
        return True

    def is_stream_active(self):
        return self.is_running

    def get_frame(self):
        if self.is_running:
            success, frame = self.camera.read()
            if success:
                ret, buffer = cv2.imencode('.jpg', frame)
                return buffer.tobytes()
        return None
</pre>
<p><strong>C. Initialize Camera</strong>&nbsp;Replace the existing&nbsp;<code>PiCameraStreamer</code>&nbsp;initialization with our new class (around line 34).<br />
Python</p>
<pre class="ql-syntax" spellcheck="false"># camera: PiCameraStreamer = PiCameraStreamer()  &lt;-- Comment this out
camera = USBCameraStreamer()                     # &lt;-- Add this
</pre>
<p><strong>D. Add Video Route</strong>&nbsp;Add the generator function and route at the bottom of the file (before&nbsp;<code>if __name__ == ...</code>).<br />
Python</p>
<pre class="ql-syntax" spellcheck="false">def gen(camera):
    while True:
        frame = camera.get_frame()
        if frame is not None:
            yield (b'--frame\r\n'
                   b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n')
        else:
             time.sleep(0.1)

@app.route('/video_feed')
def video_feed():
    return Response(gen(camera),
                    mimetype='multipart/x-mixed-replace; boundary=frame')
</pre>
<p>&nbsp;<br />
3. Modify Frontend (<code>index.html</code>)&nbsp;<br />
Update the HTML to load the video from our new route.</p>
<ul>
<li><strong>File:</strong>&nbsp;<code>web_interface/templates/index.html</code></li>
<li><strong>Action:</strong>&nbsp;Find the&nbsp;<code>&lt;img&gt;</code>&nbsp;tag with&nbsp;<code>id=&quot;stream&quot;</code>&nbsp;and change the&nbsp;<code>src</code>&nbsp;attribute.</li>
</ul>
<p>HTML</p>
<pre class="ql-syntax" spellcheck="false">&lt;img id="stream" src="{{ url_for('static', filename='streamimage.jpg') }}"&gt;

&lt;img id="stream" src="{{ url_for('video_feed') }}"&gt;
</pre>
<p>&nbsp;<br />
4. Update Logic (<code>main.js</code>)&nbsp;<br />
Prevent the JavaScript from overwriting the camera URL with the old port 8080 address.</p>
<ul>
<li><strong>File:</strong>&nbsp;<code>web_interface/static/js/main.js</code></li>
<li><strong>Action:</strong>&nbsp;Search for&nbsp;<code>$(&quot;#stream&quot;).attr(&quot;src&quot;, ...)</code>&nbsp;and replace the URL in&nbsp;<strong>two places</strong>&nbsp;(inside&nbsp;<code>sendSettings</code>&nbsp;function and&nbsp;<code>window.onload</code>&nbsp;section).</li>
</ul>
<p>JavaScript</p>
<pre class="ql-syntax" spellcheck="false">// Replace this logic in both locations:
$("#stream").attr("src", "/video_feed");
</pre>
<p>&nbsp;<br />
5. Run&nbsp;<br />
Restart the service to apply changes.<br />
Bash</p>
<pre class="ql-syntax" spellcheck="false">sudo systemctl stop walle.service
python3 app.py
</pre>
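As background on how the <code>/video_feed</code> route above works: each JPEG frame is wrapped in a multipart chunk delimited by the <code>--frame</code> boundary declared in the Response mimetype, and the browser replaces the displayed image with each new chunk. A minimal sketch of that framing, with no camera required (<code>fake_jpeg</code> is a stand-in for the bytes <code>cv2.imencode</code> would return):

```python
def mjpeg_chunk(frame_bytes):
    # Wrap one JPEG frame in a multipart/x-mixed-replace chunk,
    # matching the "boundary=frame" declared in the Response mimetype.
    return (b'--frame\r\n'
            b'Content-Type: image/jpeg\r\n\r\n' + frame_bytes + b'\r\n')

fake_jpeg = b'\xff\xd8 fake jpeg payload \xff\xd9'  # stand-in for real frame bytes
chunk = mjpeg_chunk(fake_jpeg)
```

This is the same byte layout the <code>gen()</code> generator yields in the guide above, just isolated so it can be inspected on its own.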
]]></content:encoded>
		
			</item>
		<item>
		<title>
		Comment on 3D Printed WALL·E by NGUYEN		</title>
		<link>https://wired.chillibasket.com/3d-printed-wall-e/#comment-1452</link>

		<dc:creator><![CDATA[NGUYEN]]></dc:creator>
		<pubDate>Wed, 26 Nov 2025 08:32:42 +0000</pubDate>
		<guid isPermaLink="false">https://wired.chillibasket.com/?page_id=917#comment-1452</guid>

					<description><![CDATA[Hello,
I recently assembled your Wall-E robot equipped with a PiCamera Rev 1.3 connected via CSI.
 The camera works, but the captured image appears highly zoomed in and does not have the correct proportions on the screen.
I would like to know if a specific configuration is required in order to get a properly sized image (resolution settings, zoom configuration, or PiCamera parameters to adjust).
Could you please advise which settings you recommend to obtain a non-distorted image without excessive zoom?
 Is there any fix or software update that may solve this issue?
Thank you in advance for your help!
Kind regards,]]></description>
			<content:encoded><![CDATA[<p>Hello,<br />
I recently assembled your Wall-E robot equipped with a PiCamera Rev 1.3 connected via CSI.<br />
 The camera works, but the captured image appears highly zoomed in and does not have the correct proportions on the screen.<br />
I would like to know if a specific configuration is required in order to get a properly sized image (resolution settings, zoom configuration, or PiCamera parameters to adjust).<br />
Could you please advise which settings you recommend to obtain a non-distorted image without excessive zoom?<br />
 Is there any fix or software update that may solve this issue?<br />
Thank you in advance for your help!<br />
Kind regards,</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		Comment on 3D Printed WALL·E by Nora		</title>
		<link>https://wired.chillibasket.com/3d-printed-wall-e/#comment-1450</link>

		<dc:creator><![CDATA[Nora]]></dc:creator>
		<pubDate>Fri, 14 Nov 2025 01:38:28 +0000</pubDate>
		<guid isPermaLink="false">https://wired.chillibasket.com/?page_id=917#comment-1450</guid>

					<description><![CDATA[Hi, just wondering if anyone has any advice on or a video of the assembly process for the electronic components as I want to make sure I do it correctly.]]></description>
			<content:encoded><![CDATA[<p>Hi, just wondering if anyone has any advice on or a video of the assembly process for the electronic components as I want to make sure I do it correctly.</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
