Neural Network Structure
Run the Neural Network MicroSim Fullscreen
Edit the Neural Network MicroSim Using the p5.js Editor
About This MicroSim
This interactive simulation demonstrates the structure of a fully connected (dense) neural network and helps you understand how to count the number of parameters in a model. Parameters include:
- Weights: One weight for each connection between neurons in adjacent layers
- Biases: One bias term for each neuron in hidden and output layers
How to Use
- Layers Slider: Adjust the number of layers in the network (2-7 layers including input and output)
- Neurons/Layer Slider: Change the number of neurons per layer (2-10 neurons)
- Watch the Total Parameters count update as you modify the network
Parameter Calculation
For a fully connected network with uniform layer sizes:
- Weights = neurons × neurons × (layers - 1)
- Biases = neurons × (layers - 1)
- Total Parameters = weights + biases
For example, with 3 layers and 4 neurons per layer:
- Weights = 4 × 4 × 2 = 32
- Biases = 4 × 2 = 8
- Total = 40 parameters
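The same arithmetic can be expressed as a short function. The sketch below is plain JavaScript written for this page, not code taken from the MicroSim; countParameters is an illustrative name that simply mirrors the formulas above.

```javascript
// Sketch: parameter count for a uniform, fully connected network.
// "layers" counts every layer, including the input layer, to match the slider.
function countParameters(layers, neuronsPerLayer) {
  const connections = layers - 1;                              // weight matrices between adjacent layers
  const weights = neuronsPerLayer * neuronsPerLayer * connections;
  const biases = neuronsPerLayer * connections;                // no biases on the input layer
  return { weights, biases, total: weights + biases };
}

console.log(countParameters(3, 4));  // { weights: 32, biases: 8, total: 40 }
```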
Note
There are no weights or biases associated with the input layer. Its nodes appear in the diagram only to illustrate the concept of an input layer.
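Real networks rarely use the same width in every layer. As a sketch of the more general case, the JavaScript below sums weights and biases layer by layer for arbitrary layer sizes, which also makes it clear that the input layer contributes no parameters. The function name is hypothetical and not part of the MicroSim.

```javascript
// Sketch: parameter count when each layer can have a different width.
// layerSizes lists neuron counts from the input layer to the output layer.
function countParametersGeneral(layerSizes) {
  let total = 0;
  for (let i = 1; i < layerSizes.length; i++) {
    const weights = layerSizes[i - 1] * layerSizes[i];  // one weight per connection into layer i
    const biases = layerSizes[i];                       // one bias per neuron in layer i
    total += weights + biases;
  }
  return total;  // the input layer (index 0) never adds parameters
}

console.log(countParametersGeneral([4, 4, 4]));  // 40, matching the example above
console.log(countParametersGeneral([3, 5, 2]));  // (3*5 + 5) + (5*2 + 2) = 32
```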
Lesson Plan
Learning Objectives
- Understand the basic structure of a feedforward neural network
- Identify the components: input layer, hidden layers, output layer
- Calculate the number of parameters in a neural network
- Explain the relationship between network size and parameter count
Target Audience
- High school students (grades 10-12)
- College introductory AI/ML courses
- Self-learners exploring neural network fundamentals
Prerequisites
- Basic algebra (multiplication, addition)
- Understanding of functions and inputs/outputs
- Familiarity with graphs and network diagrams
Activities
- Exploration: Start with the default settings and note the parameter count
- Prediction: Before moving sliders, predict how parameter count will change
- Pattern Discovery: Find the mathematical relationship between layers, neurons, and parameters (a short tabulation script appears after this list)
- Real-World Connection: Discuss how modern networks like GPT have billions of parameters
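For the Pattern Discovery activity, a short script can tabulate the totals across the same ranges the sliders cover. The sketch below is standalone JavaScript, assuming the uniform-width formula from the Parameter Calculation section; it is not part of the simulation code.

```javascript
// Sketch: tabulate total parameters over the slider ranges
// used by the MicroSim (2-7 layers, 2-10 neurons per layer).
function totalParameters(layers, neuronsPerLayer) {
  const connections = layers - 1;
  return neuronsPerLayer * neuronsPerLayer * connections   // weights
       + neuronsPerLayer * connections;                    // biases
}

for (let layers = 2; layers <= 7; layers++) {
  const row = [];
  for (let neurons = 2; neurons <= 10; neurons++) {
    row.push(totalParameters(layers, neurons));
  }
  console.log(`layers=${layers}: ${row.join(', ')}`);
}
```

Students can compare the printed rows against the values they observe in the MicroSim and look for how the totals grow as each slider moves.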
Discussion Questions
- Why do larger networks have more parameters?
- What are the trade-offs of having more parameters?
- How does this simple model compare to real neural networks?
References
- 3Blue1Brown - Neural Networks - Visual explanations of neural network concepts
- Neural Network Playground - TensorFlow's interactive neural network visualization
- Deep Learning Book - Goodfellow et al. - Comprehensive deep learning textbook