Fall 2025 Deep Learning Training Camp - Week 2: Convolutional Neural Networks

Week 2: Convolutional Neural Networks

Course material: https://oucai.club/classes/dl/week01#第2周-卷积神经网络
Lecture video: https://www.jianguoyun.com/p/Dde3HS8QrKKIBhi2xpEGIAA

Watched the video up to 1:06:00, covering the basic structure of a CNN: convolution, pooling, and fully connected layers.

Code exercise, Experiment 1: classifying the MNIST dataset with LeNet. Code: https://oucai.club/classes/dl/week02

Training ran for 10 epochs of 938 steps each. The per-epoch results (loss and running training accuracy at step 900, followed by the test-set evaluation) were:

| Epoch | Train Loss (step 900) | Train Acc (step 900) | Test Loss | Test Acc |
|-------|-----------------------|----------------------|-----------|----------|
| 1     | 0.0876                | 92.84%               | 0.0854    | 97.36%   |
| 2     | 0.0608                | 97.91%               | 0.0484    | 98.39%   |
| 3     | 0.0410                | 98.50%               | 0.0415    | 98.62%   |
| 4     | 0.0400                | 98.84%               | 0.0440    | 98.60%   |
| 5     | 0.0245                | 99.06%               | 0.0468    | 98.52%   |
| 6     | 0.0288                | 99.18%               | 0.0554    | 98.28%   |
| 7     | 0.0181                | 99.30%               | 0.0441    | 98.65%   |
| 8     | 0.0194                | 99.41%               | 0.0316    | 99.08%   |
| 9     | 0.0136                | 99.46%               | 0.0403    | 98.86%   |
| 10    | 0.0195                | 99.50%               | 0.0391    | 98.93%   |

Best Test Accuracy: 99.08%

Over the full run we found that the final (tenth) epoch was not the most accurate: test accuracy peaked above 99% at the eighth epoch (99.08%), while training accuracy kept rising, a hint of mild overfitting in the later epochs. ...
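The post links to the full training code above; purely as a minimal sketch (not the post's exact code), a LeNet-style network for 28x28 MNIST images can be written in PyTorch like this:

```python
import torch
import torch.nn as nn

class LeNet(nn.Module):
    """A LeNet-style CNN for 1x28x28 MNIST digits (illustrative sketch)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5, padding=2),  # 1x28x28 -> 6x28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                            # -> 6x14x14
            nn.Conv2d(6, 16, kernel_size=5),            # -> 16x10x10
            nn.ReLU(),
            nn.MaxPool2d(2),                            # -> 16x5x5
        )
        self.classifier = nn.Sequential(
            nn.Linear(16 * 5 * 5, 120),
            nn.ReLU(),
            nn.Linear(120, 84),
            nn.ReLU(),
            nn.Linear(84, 10),                          # 10 digit classes
        )

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)  # flatten all dims except the batch dim
        return self.classifier(x)

model = LeNet()
logits = model(torch.randn(2, 1, 28, 28))  # dummy batch of 2 images
```

The layer sizes follow the classic LeNet-5 layout adapted to 28x28 inputs (padding=2 on the first convolution keeps the spatial size at 28); the actual architecture used in the experiment is in the linked code.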

December 12, 2025 · 9 min · 1874 words · Me

Fall 2025 Deep Learning Training Camp - Week 1: Deep Learning Fundamentals

Week 1: Deep Learning Fundamentals

Course material: https://oucai.club/classes/dl/week01#第1周-深度学习基础
Lecture video: https://www.jianguoyun.com/p/Dde3HS8QrKKIBhi2xpEGIAA

Requirements: introductory deep learning concepts; PyTorch basics exercises; spiral-data classification code exercise.

1. Video learning

The lecture "Deep Learning Fundamentals" covers:
- Shallow neural networks: from the biological neuron to the single-layer perceptron, multi-layer perceptrons, backpropagation, and the vanishing-gradient problem
- From neural networks to deep learning: layer-wise pre-training, autoencoders, and restricted Boltzmann machines

2. Code practice

Deep learning applications can be developed on Google Colaboratory, which comes with a free Tesla K80 GPU.

PyTorch basics. What is PyTorch? PyTorch is a Python library that provides two high-level features: GPU-accelerated tensor computation, and deep neural networks built on a reverse-mode automatic differentiation system.

Defining data: a Tensor supports many data types, including torch.float32, torch.float64, torch.float16, torch.uint8, torch.int8, torch.int16, torch.int32, and torch.int64.

Defining operations: everything that operates on Tensors is a Function, and ultimately all computation is done on Tensors. The operations fall into three groups:
- Basic arithmetic: abs/sqrt/div/exp/fmod/pow, trigonometric functions such as cos/sin/asin/atan2/cosh, and rounding functions such as ceil/round/floor/trunc
- Boolean operations: gt/lt/ge/le/eq/ne, plus topk, sort, and max/min
- Linear algebra: trace, diag, mm/bmm, t, dot/cross, inverse, svd, etc.

Multiplying the matrix m by the vector v raised an error: in the earlier code, m had dtype float while v had dtype Long. Changing m @ v to m @ v.float() converts v to float and makes the multiplication succeed. ...
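The dtype mismatch described above can be reproduced in a few lines (m and v here are small illustrative tensors, not the exercise's actual data):

```python
import torch

m = torch.tensor([[1.0, 2.0], [3.0, 4.0]])  # dtype float32
v = torch.tensor([1, 2])                     # dtype int64 (Long)

# Mixing float32 and int64 operands in matrix multiplication raises an error.
try:
    m @ v
except RuntimeError as e:
    print("dtype mismatch:", e)

# The fix: cast v to float before multiplying.
result = m @ v.float()
print(result)  # tensor([ 5., 11.])
```

Casting v (rather than m) keeps the computation in float, which is what the subsequent float operations expect.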

December 2, 2025 · 3 min · 455 words · Me

Quote

A good life is one inspired by love and guided by knowledge. (Bertrand Russell) A rose by any other name would smell as sweet. (William Shakespeare)

November 28, 2025 · 1 min · 25 words · Me

Introduction to Computing Lab Report 6

Experiment 6: Sorting Algorithms. Computer Science Class 3, 施家鑫, 25020007105.

Contents: C implementation (generate 100000 random numbers; implement bubble sort; implement quick sort; record and compare the execution times; main program), Python implementation (same steps), and notes on writing this document.

C implementation: randomly generate 100000 numbers, sort them with both bubble sort and quick sort, and compare the execution times.

Generating 100000 random numbers. Include the headers:

```c
#include <stdlib.h>  // rand()
#include <time.h>    // time() to seed the generator (so the numbers are "random"),
                     // and clock() to measure execution time
```

Seed the generator with time():

```c
srand(time(NULL));
```

Generate the 100000 random numbers:

```c
int a[100001] = {0};
for (int i = 0; i < 100000; i++) {
    a[i] = rand() % 1000000;
}
```

Implementing bubble sort:

```c
void bubble_sort(int a[], int n) {
    int i, j, temp;
    for (i = 0; i < n - 1; i++) {
        for (j = 0; j < n - i - 1; j++) {
            if (a[j] > a[j + 1]) {
                temp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = temp;
            }
        }
    }
}
```

Implementing quick sort:

```c
void quick_sort(int a[], int low, int high) {
    if (low < high) {
        int pivot = a[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (a[j] < pivot) {
                i++;
                int temp = a[i];
                a[i] = a[j];
                a[j] = temp;
            }
        }
        int temp = a[i + 1];
        a[i + 1] = a[high];
        a[high] = temp;
        int pi = i + 1;
        quick_sort(a, low, pi - 1);
        quick_sort(a, pi + 1, high);
    }
}
```

Recording and comparing the two execution times: declare two variables for the start and end times,

```c
clock_t start, end;
```

then place `start = clock();` before each sort call and `end = clock();` after it to record the elapsed time. ...
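The report's Python section is truncated above. Purely as an illustrative sketch (not the report's actual Python code), the same bubble-vs-quick timing comparison could look like this, using a smaller n than the report's 100000 so the quadratic bubble sort finishes quickly:

```python
import random
import time

def bubble_sort(a):
    # Repeatedly swap adjacent out-of-order elements; O(n^2) comparisons.
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]

def quick_sort(a, low, high):
    # Lomuto partition around a[high], mirroring the C version in the report.
    if low < high:
        pivot = a[high]
        i = low - 1
        for j in range(low, high):
            if a[j] < pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[i + 1], a[high] = a[high], a[i + 1]
        quick_sort(a, low, i)       # left of the pivot
        quick_sort(a, i + 2, high)  # right of the pivot

random.seed(0)
data = [random.randrange(1_000_000) for _ in range(2000)]

b = data[:]
t0 = time.perf_counter()
bubble_sort(b)
bubble_time = time.perf_counter() - t0

q = data[:]
t0 = time.perf_counter()
quick_sort(q, 0, len(q) - 1)
quick_time = time.perf_counter() - t0

print(f"bubble: {bubble_time:.4f}s, quick: {quick_time:.4f}s")
```

Even at n = 2000 the gap is large; at the report's n = 100000 the O(n^2) bubble sort is slower than quick sort's average O(n log n) by several orders of magnitude.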

November 28, 2025 · 3 min · 512 words · Me