AI Basic Data Structures


Before jumping into the fun parts of AI programming, I want to walk through some of the mathematics behind it. AI is built on maths: it uses mathematical models to mimic the real world and to solve real problems.

AI relies heavily on mathematical concepts to process data, make predictions, and optimize models. Linear algebra, calculus, probability, and statistics form the backbone of AI algorithms.

Today, we will focus on scalars, vectors, and matrices, which are fundamental to representing and manipulating data in AI.

Scalar

A single number (e.g., temperature, age).
Examples: 5, 3.14, -10.

Vector

An ordered list of numbers (e.g., coordinates, feature vectors in ML).

Examples: [1, 2, 3], [0.5, -1.2, 4.7].

Matrix

A 2D array of numbers (e.g., image pixels, dataset tables).

Example:

[[1, 2, 3], [4, 5, 6], [7, 8, 9]]

Tensor

A multi-dimensional array of numerical values. It generalizes the concepts of scalars, vectors, and matrices to higher dimensions. Tensors are commonly used in deep learning frameworks such as TensorFlow and PyTorch to represent and manipulate data efficiently.

Rank (or Order) of a tensor: The number of dimensions in the tensor.

Examples:

  • A scalar is a tensor of rank 0 (0-dimensional).

  • A vector is a tensor of rank 1 (1-dimensional).

  • A matrix is a tensor of rank 2 (2-dimensional).

  • Higher-dimensional tensors (rank > 2) are used for more complex data structures, such as 3D images or multi-channel data, e.g. [[[1, 2, 3], [4, 5, 6]], [[7, 8, 9], [10, 11, 12]]] (see the quick rank check below).
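One easy way to make rank concrete is NumPy's ndim attribute, which reports the number of dimensions of an array. Here is a minimal sketch (the values are just illustrative); deep learning frameworks such as TensorFlow and PyTorch expose the same idea through their own tensor types.

import numpy as np

# ndim returns the number of dimensions, i.e. the rank of the tensor
scalar = np.array(5)                                 # rank 0
vector = np.array([1, 2, 3])                         # rank 1
matrix = np.array([[1, 2, 3], [4, 5, 6]])            # rank 2
tensor = np.array([[[1, 2, 3], [4, 5, 6]],
                   [[7, 8, 9], [10, 11, 12]]])       # rank 3

print("Scalar rank:", scalar.ndim)   # 0
print("Vector rank:", vector.ndim)   # 1
print("Matrix rank:", matrix.ndim)   # 2
print("Tensor rank:", tensor.ndim)   # 3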

What are the differences and relationships between them?

  • Scalars are 0-dimensional, vectors are 1-dimensional, and matrices are 2-dimensional.

  • Vectors can be regarded as a special case of matrices (with one row or column).

  • Tensors generalize the concepts of scalars, vectors, and matrices to higher dimensions.

Python Code: Creating Scalars, Vectors, Matrices, and Tensors

import numpy as np

# Scalar
scalar = 5
print("Scalar:", scalar)

# Vector
vector = np.array([1, 2, 3])
print("Vector:\n", vector)

# Matrix
matrix = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print("Matrix:\n", matrix)

# Rank-3 Tensor (3D Array)
tensor_3d = np.array([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
print("Rank-3 Tensor (3D Array):\n", tensor_3d)
print("Shape of Rank-3 Tensor:", tensor_3d.shape, "\n")

Key Operations

  1. Addition and Subtraction:

    • Element-wise operations for vectors and matrices.

  2. Multiplication:

    • Scalar multiplication: Multiply every element by a scalar.

    • Dot product: Sum of the products of corresponding elements.

    • Matrix multiplication: Row-by-column multiplication; each entry of the result is the dot product of a row of the first matrix with a column of the second (worked out just below).
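For a concrete feel (using numbers different from the snippet below, so the challenge further down still stands): the dot product of [1, 2] and [3, 4] is 1×3 + 2×4 = 11, and multiplying [[1, 2], [3, 4]] by [[0, 1], [2, 3]] gives [[1×0 + 2×2, 1×1 + 2×3], [3×0 + 4×2, 3×1 + 4×3]] = [[4, 7], [8, 15]].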

Python Code: Performing Operations on These Maths Objects


import numpy as np

# Vector addition
vector1 = np.array([1, 2, 3])
vector2 = np.array([4, 5, 6])
vector_sum = vector1 + vector2
print("Vector Addition:\n", vector_sum)

# Scalar multiplication
scalar_mult = 2 * vector1
print("Scalar Multiplication:\n", scalar_mult)

# Dot product
dot_product = np.dot(vector1, vector2)
print("Dot Product:\n", dot_product)

# Matrix multiplication
matrix1 = np.array([[1, 2], [3, 4]])
matrix2 = np.array([[5, 6], [7, 8]])
matrix_product = np.matmul(matrix1, matrix2)
print("Matrix multiplication:\n", matrix_product)

# Tensor Operations
# Element-wise addition
tensor1 = np.array([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
tensor2 = np.array([[[9, 10], [11, 12]], [[13, 14], [15, 16]]])
tensor_sum = tensor1 + tensor2
print("Element-wise Tensor Addition:\n", tensor_sum)

# Tensor reshaping (tensor_3d was defined in the first snippet above)
# reshape(2, 4) rearranges the 2x2x2 tensor into 2 rows of 4 elements
reshaped_tensor = tensor_3d.reshape(2, 4)
print("Reshaped Tensor (2x4):\n", reshaped_tensor)

Challenge: Can you work out the outputs of the operations above before running the code?

Examples of applications in AI

  • Scalars: Used for hyperparameters (e.g., learning rate).

  • Vectors: Represent data points (e.g., feature vectors in ML).

  • Matrices: Represent datasets, transformations, and weights in neural networks (e.g. in image processing, a grayscale image is represented as a matrix of pixel intensities; see the sketch below).
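Here is a minimal sketch of those last two bullets in NumPy; the pixel intensities and feature values are made up purely for illustration.

import numpy as np

# A tiny made-up 3x3 grayscale "image": each entry is a pixel intensity (0-255)
image = np.array([[  0, 128, 255],
                  [ 64, 192,  32],
                  [255,   0, 128]])
print("Image matrix shape:", image.shape)        # (3, 3)

# A feature vector for one data point, e.g. [height_cm, weight_kg, age_years]
features = np.array([170.0, 65.0, 30.0])
print("Feature vector shape:", features.shape)   # (3,)

# Stacking several images gives a rank-3 tensor: (num_images, height, width)
batch = np.stack([image, image, image])
print("Image batch shape:", batch.shape)         # (3, 3, 3)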

Summary and Next Steps

  • Recap:

    • Scalars are single numbers, vectors are 1D arrays, matrices are 2D arrays, and tensors are higher-dimensional arrays.

    • Linear algebra operations like addition, multiplication, and dot products are fundamental to AI.

  • Next Steps:
    In the next blog, I will dive deeper into calculus for AI, exploring derivatives, gradients, and their role in optimization. Please let me know if you have any questions about this article. Thanks for your time reading, learning, and sharing :)