Add artificial_neural_network.py in neural_network #11858

Status: Closed (wanted to merge 13 commits; diff below shows changes from 5 commits)
6 changes: 6 additions & 0 deletions DIRECTORY.md
@@ -22,6 +22,7 @@
* [Rat In Maze](backtracking/rat_in_maze.py)
* [Sudoku](backtracking/sudoku.py)
* [Sum Of Subsets](backtracking/sum_of_subsets.py)
* [Word Break](backtracking/word_break.py)
* [Word Ladder](backtracking/word_ladder.py)
* [Word Search](backtracking/word_search.py)

@@ -99,6 +100,7 @@
* [Elgamal Key Generator](ciphers/elgamal_key_generator.py)
* [Enigma Machine2](ciphers/enigma_machine2.py)
* [Fractionated Morse Cipher](ciphers/fractionated_morse_cipher.py)
* [Gronsfeld Cipher](ciphers/gronsfeld_cipher.py)
* [Hill Cipher](ciphers/hill_cipher.py)
* [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py)
* [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py)
@@ -211,6 +213,7 @@
* [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py)
* [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py)
* [Maximum Fenwick Tree](data_structures/binary_tree/maximum_fenwick_tree.py)
* [Maximum Sum Bst](data_structures/binary_tree/maximum_sum_bst.py)
* [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py)
* [Mirror Binary Tree](data_structures/binary_tree/mirror_binary_tree.py)
* [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py)
@@ -284,6 +287,7 @@
* [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py)
* [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py)
* [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py)
* [Lexicographical Numbers](data_structures/stacks/lexicographical_numbers.py)
* [Next Greater Element](data_structures/stacks/next_greater_element.py)
* [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py)
* [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py)
@@ -820,6 +824,7 @@
* [Softplus](neural_network/activation_functions/softplus.py)
* [Squareplus](neural_network/activation_functions/squareplus.py)
* [Swish](neural_network/activation_functions/swish.py)
* [Artificial Neural Network](neural_network/artificial_neural_network.py)
* [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py)
* [Convolution Neural Network](neural_network/convolution_neural_network.py)
* [Input Data](neural_network/input_data.py)
@@ -1201,6 +1206,7 @@
* [Binary Tree Traversal](searches/binary_tree_traversal.py)
* [Double Linear Search](searches/double_linear_search.py)
* [Double Linear Search Recursion](searches/double_linear_search_recursion.py)
* [Exponential Search](searches/exponential_search.py)
* [Fibonacci Search](searches/fibonacci_search.py)
* [Hill Climbing](searches/hill_climbing.py)
* [Interpolation Search](searches/interpolation_search.py)
96 changes: 96 additions & 0 deletions neural_network/artificial_neural_network.py
@@ -0,0 +1,96 @@
"""
Simple Artificial Neural Network (ANN)
- Feedforward Neural Network with 1 hidden layer and Sigmoid activation.
- Uses Gradient Descent for backpropagation and Mean Squared Error (MSE)
as the loss function.
- Example demonstrates solving the XOR problem.
"""

import numpy as np


class ANN:
"""
Artificial Neural Network (ANN)

- Feedforward Neural Network with 1 hidden layer
and Sigmoid activation.
- Uses Gradient Descent for backpropagation.
- Example demonstrates solving the XOR problem.
"""

def __init__(self, input_size, hidden_size, output_size, learning_rate=0.1):
# Initialize weights using np.random.Generator
rng = np.random.default_rng()
self.weights_input_hidden = rng.standard_normal((input_size, hidden_size))
self.weights_hidden_output = rng.standard_normal((hidden_size, output_size))

# Initialize biases
self.bias_hidden = np.zeros((1, hidden_size))
self.bias_output = np.zeros((1, output_size))

# Learning rate
self.learning_rate = learning_rate

def sigmoid(self, x):
"""Sigmoid activation function."""
return 1 / (1 + np.exp(-x))

def sigmoid_derivative(self, x):

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

As there is no test file in this pull request nor any test function or class in the file neural_network/artificial_neural_network.py, please provide doctest for the function sigmoid_derivative

Please provide return type hint for the function: sigmoid_derivative. If the function does not return a value, please provide the type hint as: def function() -> None:

Please provide descriptive name for the parameter: x

Please provide type hint for the parameter: x

"""Derivative of the sigmoid function."""
return x * (1 - x)
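A sketch of how sigmoid_derivative could satisfy the review comment above (type hints, a descriptive parameter name, and a doctest). The standalone function below is an illustration only, not part of the PR's diff:

```python
import numpy as np


def sigmoid_derivative(sigmoid_output: np.ndarray) -> np.ndarray:
    """
    Derivative of the sigmoid, expressed in terms of the already-activated
    output s = sigmoid(x), so that d/dx sigmoid(x) = s * (1 - s).

    >>> sigmoid_derivative(np.array([0.5]))
    array([0.25])
    >>> sigmoid_derivative(np.array([0.0, 1.0]))
    array([0., 0.])
    """
    return sigmoid_output * (1 - sigmoid_output)


if __name__ == "__main__":
    import doctest

    doctest.testmod()
```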

def feedforward(self, x):

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

As there is no test file in this pull request nor any test function or class in the file neural_network/artificial_neural_network.py, please provide doctest for the function feedforward

Please provide return type hint for the function: feedforward. If the function does not return a value, please provide the type hint as: def function() -> None:

Please provide descriptive name for the parameter: x

Please provide type hint for the parameter: x

"""Forward pass."""
self.hidden_input = np.dot(x, self.weights_input_hidden) + self.bias_hidden
self.hidden_output = self.sigmoid(self.hidden_input)
self.final_input = (
np.dot(self.hidden_output, self.weights_hidden_output) + self.bias_output
)
self.final_output = self.sigmoid(self.final_input)
return self.final_output

def backpropagation(self, x, y, output):

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

As there is no test file in this pull request nor any test function or class in the file neural_network/artificial_neural_network.py, please provide doctest for the function backpropagation

Please provide return type hint for the function: backpropagation. If the function does not return a value, please provide the type hint as: def function() -> None:

Please provide descriptive name for the parameter: x

Please provide type hint for the parameter: x

Please provide descriptive name for the parameter: y

Please provide type hint for the parameter: y

Please provide type hint for the parameter: output

"""Backpropagation to adjust weights."""
error = y - output
output_gradient = error * self.sigmoid_derivative(output)
hidden_error = output_gradient.dot(self.weights_hidden_output.T)
hidden_gradient = hidden_error * self.sigmoid_derivative(self.hidden_output)

self.weights_hidden_output += (
self.hidden_output.T.dot(output_gradient) * self.learning_rate
)
self.bias_output += (
np.sum(output_gradient, axis=0, keepdims=True) * self.learning_rate
)

self.weights_input_hidden += x.T.dot(hidden_gradient) * self.learning_rate
self.bias_hidden += (
np.sum(hidden_gradient, axis=0, keepdims=True) * self.learning_rate
)
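The sign convention in the update above (`error = y - output` feeding `+=` weight updates) amounts to gradient descent on a squared-error loss (up to a constant factor of ½). A self-contained sanity check, using a single sigmoid neuron with arbitrary illustrative values, compares the analytic gradient against a central finite difference:

```python
import numpy as np


def sigmoid(z: float) -> float:
    return 1 / (1 + np.exp(-z))


# One sigmoid neuron with loss L(w) = 0.5 * (y - sigmoid(w * x))**2.
# The input, target, and weight below are arbitrary illustrative numbers.
x, y_true, w = 0.7, 1.0, 0.3

out = sigmoid(w * x)
# dL/dw = -(y - out) * out * (1 - out) * x, so the PR's "+=" update moves
# w in the -dL/dw direction, i.e. plain gradient descent.
analytic = -(y_true - out) * out * (1 - out) * x


def loss(weight: float) -> float:
    return 0.5 * (y_true - sigmoid(weight * x)) ** 2


eps = 1e-6
numeric = (loss(w + eps) - loss(w - eps)) / (2 * eps)
assert abs(analytic - numeric) < 1e-9  # analytic and numeric gradients agree
```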

def train(self, x, y, epochs=10000):
"""Train the network."""
for epoch in range(epochs):
output = self.feedforward(x)
self.backpropagation(x, y, output)
if epoch % 1000 == 0:
loss = np.mean(np.square(y - output))
print(f"Epoch {epoch}, Loss: {loss}")

def predict(self, x):
"""Make predictions."""
return self.feedforward(x)


if __name__ == "__main__":
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])
# Initialize the neural network
ann = ANN(input_size=2, hidden_size=2, output_size=1, learning_rate=0.1)
# Train the neural network
    ann.train(X, y, epochs=10000)  # XOR needs many iterations to converge
# Predict
predictions = ann.predict(X)
print("Predictions:")
print(predictions)
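One obstacle to the doctests requested in the review comments is that the constructor draws weights from an unseeded np.random.default_rng(), so feedforward and train outputs are not reproducible. A small illustration of how seeding makes initialization deterministic (accepting a seed is an assumption here; the PR's __init__ has no such parameter):

```python
import numpy as np

# Two generators created with the same seed produce identical weight
# matrices, which is what reproducible doctests would require.
first = np.random.default_rng(42).standard_normal((2, 2))
second = np.random.default_rng(42).standard_normal((2, 2))
assert np.array_equal(first, second)

# A different seed produces different initial weights.
third = np.random.default_rng(7).standard_normal((2, 2))
assert not np.array_equal(first, third)
```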