
Overview

The kinematics experiment provides a comprehensive study of 1D motion. You can measure position vs. time and calculate velocity and acceleration using either camera-based tracking or distance sensors (ultrasonic or ToF laser).

Physics Theory

Fundamental Kinematic Quantities

  • Position: $x(t)$ - location as a function of time
  • Velocity: $v(t) = \frac{dx}{dt}$ - rate of change of position
  • Acceleration: $a(t) = \frac{dv}{dt} = \frac{d^2x}{dt^2}$ - rate of change of velocity

Types of Motion

Uniform Motion (constant velocity): $x(t) = x_0 + vt$, with $a = 0$

Uniformly Accelerated Motion: $x(t) = x_0 + v_0 t + \frac{1}{2}at^2$, $v(t) = v_0 + at$

General Motion:
  • Position data collected experimentally
  • Velocity from numerical differentiation (central difference): $v_i = \frac{x_{i+1} - x_{i-1}}{2\Delta t}$
  • Acceleration from a second central difference: $a_i = \frac{v_{i+1} - v_{i-1}}{2\Delta t}$
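The central-difference formulas above can be checked numerically in a few lines. A minimal sketch (the sampled function $x = t^2$ is illustrative; for it, the central difference is exact):

```python
import numpy as np

# Positions sampled at a uniform interval dt; here x = t^2, so v = 2t and a = 2
dt = 0.1
t = np.arange(0, 1 + dt / 2, dt)
x = t**2

# Central difference for velocity: v_i = (x[i+1] - x[i-1]) / (2*dt)
v = (x[2:] - x[:-2]) / (2 * dt)

# Apply the same stencil to v for the acceleration
a = (v[2:] - v[:-2]) / (2 * dt)

print(v[:3])  # velocities at interior points t = 0.1, 0.2, 0.3
print(a[:3])  # all ≈ 2.0, matching d²(t²)/dt² = 2
```

Note that each central difference drops one point at each end of the array, so `v` and `a` are shorter than `x`; `np.gradient` (used later in this guide) avoids that by switching to one-sided differences at the boundaries.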

Measurement Methods

Video-Based Position Tracking

Track a colored object moving in 1D (horizontal or vertical) using OpenCV.

Source: ~/workspace/source/Kinemactic/kinematicCam/analisis.py

Hardware Requirements

  • Camera (webcam, USB camera, phone)
  • Colored marker on moving object
  • Ruler or calibration markers (known distance)
  • Track or path for object motion

Setup Procedure

1. Setup Physical Track

  • Create horizontal or vertical track
  • Attach colored marker to moving object
  • Place calibration markers at known distances
  • Ensure motion is approximately 1D

2. Record Video

  • Position camera perpendicular to motion
  • Ensure full range of motion visible
  • Record at consistent frame rate
  • Ensure adequate lighting

3. Run Analysis

cd ~/workspace/source/Kinemactic/kinematicCam
python analisis.py

4. Configure Analysis

Frame Selection:
  • Navigate: d (next), a (previous)
  • Mark start: i
  • Mark end: f
Color Calibration:
  • Select ROI around moving object
  • Automatic HSV range detection
Spatial Calibration:
  • Click on two points with known separation
  • Enter distance in meters
  • System calculates pixel-to-meter scale

Key Code Sections

Color Detection (analisis.py:85-99):
# Select ROI for color calibration
roi = cv2.selectROI("Seleccionar objeto", frame, False)
x, y, w, h = roi
objeto = frame[y:y+h, x:x+w]
hsv_objeto = cv2.cvtColor(objeto, cv2.COLOR_BGR2HSV)

h_mean = np.mean(hsv_objeto[:,:,0])
hsv_lower = np.array([h_mean-15, 50, 50])
hsv_upper = np.array([h_mean+15, 255, 255])
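One caveat with the HSV range above: OpenCV hue values live in [0, 179], so for reddish objects `h_mean - 15` can go negative. A hedged sketch of a clamped range (this helper is not part of analisis.py):

```python
import numpy as np

def hsv_range(h_mean, h_tol=15):
    """Clamp the hue window to OpenCV's valid [0, 179] hue range.

    Note: this clamps rather than wraps; a red object whose hue
    straddles the 0/179 boundary may need two masks combined
    with cv2.bitwise_or.
    """
    lower = np.array([max(h_mean - h_tol, 0), 50, 50])
    upper = np.array([min(h_mean + h_tol, 179), 255, 255])
    return lower, upper

lower, upper = hsv_range(5)
print(lower[0], upper[0])  # 0 20
```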
Pixel-to-Meter Calibration (analisis.py:107-135):
# User clicks two points with known distance
puntos = []

def click(event, x, y, flags, param):
    if event == cv2.EVENT_LBUTTONDOWN:
        puntos.append((x,y))

cv2.setMouseCallback("Calibracion", click)

# Wait for 2 clicks
while len(puntos) < 2:
    cv2.waitKey(1)

dist_px = np.linalg.norm(np.array(puntos[0]) - np.array(puntos[1]))
escala = distancia_real_m / dist_px  # m/pixel
Object Tracking (analisis.py:145-180):
while frame_num <= frame_fin:
    ret, frame = cap.read()
    if not ret:
        break
    
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, hsv_lower, hsv_upper)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5,5),np.uint8))
    
    contornos, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, 
                                     cv2.CHAIN_APPROX_SIMPLE)
    
    if contornos:
        c = max(contornos, key=cv2.contourArea)
        M = cv2.moments(c)
        
        if M["m00"] != 0:
            cx = int(M["m10"]/M["m00"])
            cy = int(M["m01"]/M["m00"])
            
            x_m = cx * escala
            y_m = cy * escala
            tiempo = (frame_num - frame_inicio) / fps
            
            datos.append((tiempo, x_m, y_m))

Output

  • posicion_vs_tiempo.txt: Time and position data
  • Automatic plot of position vs. time
  • Polynomial fit for acceleration estimation
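The polynomial-fit acceleration estimate can also be reproduced offline from the output file. A minimal sketch, assuming time in column 0 and position in column 1 (synthetic free-fall data stand in for `posicion_vs_tiempo.txt` here):

```python
import numpy as np

# Synthetic stand-in for posicion_vs_tiempo.txt: x = x0 + v0*t + 0.5*a*t^2
t = np.linspace(0, 2, 50)
x = 0.1 + 0.5 * t + 0.5 * (-9.8) * t**2

# Fit a degree-2 polynomial; coefficients come out as [a/2, v0, x0]
coeffs = np.polyfit(t, x, 2)
a_est = 2 * coeffs[0]

print(f"Estimated acceleration: {a_est:.2f} m/s^2")  # ≈ -9.80
```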

Data Analysis

Velocity from Position Data

Numerical differentiation with smoothing:
import numpy as np
import matplotlib.pyplot as plt

# Load data
data = np.loadtxt('posicion_vs_tiempo.txt')
t = data[:, 0]
x = data[:, 1]  # or calculate distance: np.linalg.norm(data[:,1:3], axis=1)

# Smoothing function
def smooth(arr, window=5):
    kernel = np.ones(window) / window
    return np.convolve(arr, kernel, mode='same')

# Smooth position
x_smooth = smooth(x, window=5)

# Calculate velocity (central difference)
v = np.gradient(x_smooth, t)
v_smooth = smooth(v, window=5)

# Calculate acceleration
a = np.gradient(v_smooth, t)
a_smooth = smooth(a, window=5)

Plotting Results

fig, axs = plt.subplots(3, 1, figsize=(10, 10), sharex=True)

# Position
axs[0].plot(t, x, 'o', alpha=0.3, label='Raw')
axs[0].plot(t, x_smooth, '-', lw=2, label='Smoothed')
axs[0].set_ylabel('Position (m)')
axs[0].legend()
axs[0].grid()

# Velocity
axs[1].plot(t, v_smooth, '-', lw=2, color='orange')
axs[1].axhline(0, color='gray', linestyle='--')
axs[1].set_ylabel('Velocity (m/s)')
axs[1].grid()

# Acceleration
axs[2].plot(t, a_smooth, '-', lw=2, color='red')
axs[2].axhline(0, color='gray', linestyle='--')
axs[2].set_ylabel('Acceleration (m/s²)')
axs[2].set_xlabel('Time (s)')
axs[2].grid()

plt.tight_layout()
plt.savefig('kinematics_analysis.png', dpi=150)
plt.show()

Motion Classification

Automatically determine motion type:
# Calculate mean absolute values
mean_v = np.mean(np.abs(v_smooth))
mean_a = np.mean(np.abs(a_smooth))

# Thresholds (adjust based on noise)
v_threshold = 0.01  # m/s
a_threshold = 0.1   # m/s²

if mean_v < v_threshold:
    motion_type = "Stationary"
elif mean_a < a_threshold:
    motion_type = "Uniform (constant velocity)"
else:
    motion_type = "Accelerated"

print(f"Motion type: {motion_type}")
print(f"Mean velocity: {mean_v:.4f} m/s")
print(f"Mean acceleration: {mean_a:.4f} m/s²")

Curve Fitting

Linear (uniform motion):
from scipy.optimize import curve_fit

def linear(t, x0, v):
    return x0 + v * t

popt, pcov = curve_fit(linear, t, x_smooth)
x0_fit, v_fit = popt
perr = np.sqrt(np.diag(pcov))

print(f"Initial position: {x0_fit:.4f} ± {perr[0]:.4f} m")
print(f"Velocity: {v_fit:.4f} ± {perr[1]:.4f} m/s")
Quadratic (constant acceleration):
def quadratic(t, x0, v0, a):
    return x0 + v0 * t + 0.5 * a * t**2

popt, pcov = curve_fit(quadratic, t, x_smooth)
x0_fit, v0_fit, a_fit = popt
perr = np.sqrt(np.diag(pcov))

print(f"Initial position: {x0_fit:.4f} ± {perr[0]:.4f} m")
print(f"Initial velocity: {v0_fit:.4f} ± {perr[1]:.4f} m/s")
print(f"Acceleration: {a_fit:.4f} ± {perr[2]:.4f} m/s²")
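To decide which model describes the data, compare the residual sum of squares of the two fits; the quadratic model should win decisively for accelerated motion. A sketch on synthetic data (replace `t` and `x` with your measured arrays):

```python
import numpy as np
from scipy.optimize import curve_fit

def linear(t, x0, v):
    return x0 + v * t

def quadratic(t, x0, v0, a):
    return x0 + v0 * t + 0.5 * a * t**2

# Synthetic accelerated motion with a little measurement noise
rng = np.random.default_rng(0)
t = np.linspace(0, 2, 100)
x = 0.2 + 1.0 * t + 0.5 * 2.0 * t**2 + rng.normal(0, 0.01, t.size)

rss = {}
for name, model in [("linear", linear), ("quadratic", quadratic)]:
    popt, _ = curve_fit(model, t, x)
    rss[name] = np.sum((x - model(t, *popt))**2)

print(rss["quadratic"] < rss["linear"])  # True for accelerated data
```

If both models fit comparably well (residuals at the noise level), prefer the simpler linear model.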

Experimental Scenarios

1. Constant Velocity Cart

Setup: Cart on air track or low-friction surface, gentle push

Expected:
  • Linear x(t)
  • Constant v
  • a ≈ 0

2. Accelerated Cart

Setup: Cart on inclined plane or with hanging mass

Expected:
  • Quadratic x(t)
  • Linear v(t)
  • Constant a

3. Oscillating Mass

Setup: Mass on spring or pendulum (viewed from side)

Expected:
  • Sinusoidal x(t)
  • Cosine v(t) (90° phase shift)
  • Negative sine a(t)
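For the oscillating case, a sinusoidal fit recovers amplitude and angular frequency directly. A sketch on synthetic data (swap in your measured `t` and `x`; the initial guesses are a practical assumption, since oscillatory fits converge poorly without them):

```python
import numpy as np
from scipy.optimize import curve_fit

def sinusoid(t, A, omega, phi, offset):
    return A * np.sin(omega * t + phi) + offset

# Synthetic oscillation: A = 0.05 m, omega = 2*pi rad/s (f = 1 Hz)
t = np.linspace(0, 3, 300)
x = sinusoid(t, 0.05, 2 * np.pi, 0.3, 0.1)

# Reasonable initial guesses from the data itself
p0 = [x.std() * np.sqrt(2), 2 * np.pi, 0, x.mean()]
popt, _ = curve_fit(sinusoid, t, x, p0=p0)
A_fit, omega_fit = popt[0], popt[1]

print(f"f = {omega_fit / (2 * np.pi):.2f} Hz")  # ≈ 1.00
```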

4. Free Fall (Vertical)

Setup: Object dropped, camera viewing from side

Expected:
  • Quadratic x(t)
  • Linear v(t) with slope -g
  • a ≈ -9.8 m/s² (taking up as positive)

Tips for Best Results

Camera method:
  • Mount camera very stably (tripod essential)
  • Use high-contrast colored marker
  • Ensure marker always visible (no occlusion)
  • Adequate frame rate (30+ FPS)
  • Minimize parallax (camera perpendicular)
  • Measure calibration distance accurately

Sensor method:
  • Mount sensor rigidly
  • Align perpendicular to motion
  • Use flat, non-reflective target
  • Shield from ambient IR light
  • Choose appropriate filter for your motion
  • Calibrate zero position

Data processing:
  • Apply appropriate smoothing (balance noise vs. response)
  • Use central difference for derivatives (more accurate)
  • Check for systematic errors (offset, scaling)
  • Estimate uncertainties from repeated trials
  • Plot raw and processed data together

Troubleshooting

| Issue            | Camera Method                   | Sensor Method              |
|------------------|---------------------------------|----------------------------|
| Noisy position   | Better lighting, less blur      | Increase filter strength   |
| Erratic velocity | More smoothing, higher FPS      | Lower cutoff frequency     |
| Lost tracking    | Brighter marker, adjust HSV     | Check alignment, target    |
| Scale errors     | Re-measure calibration distance | Verify sensor calibration  |
| Time sync issues | Verify FPS, check frame drops   | Check timer configuration  |

Advanced Analysis

Uncertainty Quantification

# Repeat the experiment N times and load each dataset
# (file names are illustrative; adjust to your own)
N_trials = 5
trials = [np.loadtxt(f'trial_{i}.txt') for i in range(N_trials)]

# Interpolate each trial onto a common time base so trials can be averaged
t_common = np.linspace(0, min(tr[-1, 0] for tr in trials), 200)
all_data = np.array([np.interp(t_common, tr[:, 0], tr[:, 1]) for tr in trials])

# Calculate mean and std at each time point
x_mean = np.mean(all_data, axis=0)
x_std = np.std(all_data, axis=0)

# Plot with error bars and a shaded ±1σ band
plt.errorbar(t_common, x_mean, yerr=x_std, fmt='o-', capsize=3)
plt.fill_between(t_common, x_mean - x_std, x_mean + x_std, alpha=0.3)

Frequency Analysis

For periodic motion:
from scipy.fft import fft, fftfreq

# FFT of position signal
X = fft(x_smooth - np.mean(x_smooth))
freqs = fftfreq(len(t), t[1] - t[0])

# Plot power spectrum
plt.figure()
plt.plot(freqs[:len(freqs)//2], np.abs(X[:len(X)//2]))
plt.xlabel('Frequency (Hz)')
plt.ylabel('Magnitude')
plt.title('Frequency Spectrum')
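The dominant frequency can then be read off the spectrum's peak. A self-contained sketch (a synthetic 2 Hz signal stands in for `x_smooth` here):

```python
import numpy as np
from scipy.fft import fft, fftfreq

# Synthetic 2 Hz oscillation sampled at 50 Hz for 4 s
dt = 0.02
t = np.arange(0, 4, dt)
x_smooth = 0.05 * np.sin(2 * np.pi * 2.0 * t)

X = fft(x_smooth - np.mean(x_smooth))
freqs = fftfreq(len(t), dt)

# Keep positive frequencies only and locate the spectral peak
pos = freqs > 0
f_peak = freqs[pos][np.argmax(np.abs(X[pos]))]
print(f"Dominant frequency: {f_peak:.2f} Hz")  # 2.00
```

Frequency resolution is 1/T, where T is the total record length, so record several full oscillation periods for a sharp peak.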

Next Steps

Experiments Overview

Explore other physics experiments

Data Analysis Guide

Advanced filtering and analysis techniques
