I. Overview
ResNet (Residual Network) was proposed in 2015 and won that year's ImageNet image-classification competition. Its core idea is to introduce shortcut connections across layers, which alleviates the vanishing/exploding-gradient problems that make very deep networks hard to train. ResNet18 is one of the smaller ResNet variants: it has 18 weighted layers in total, consisting of 17 convolutional layers and 1 fully connected layer.
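The shortcut idea can be written as y = F(x) + x, where F is a small stack of convolutions and the identity path gives gradients a direct route back to earlier layers. A minimal, self-contained sketch of this pattern (the module name and tensor sizes here are illustrative only, not part of the original article):

import torch
import torch.nn as nn

class TinyResidual(nn.Module):
    """Illustrative residual wrapper: output = ReLU(F(x) + x)."""
    def __init__(self, channels):
        super().__init__()
        # F(x): two 3x3 convolutions that keep the spatial size and channel count
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        # The identity shortcut is added before the final ReLU
        return torch.relu(self.body(x) + x)

x = torch.randn(1, 64, 32, 32)
print(TinyResidual(64)(x).shape)  # torch.Size([1, 64, 32, 32])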
II. ResNet18 Details
1. Convolutional Layers
Apart from the first convolutional layer, ResNet18 contains 16 convolutional layers, all with 3×3 kernels and each followed by Batch Normalization and a ReLU activation. These 16 convolutions are grouped into 8 residual blocks, arranged in 4 stages of 2 blocks with 64, 128, 256 and 512 output channels respectively; every block has a shortcut connection that adds the block's input feature map to the output of its second convolution. Most convolutions use stride 1; in the CIFAR-style variant implemented below, the first convolution also uses stride 1, and downsampling between stages is done with stride-2 convolutions rather than pooling layers. A sketch that counts the weighted layers of the standard torchvision model is given below.
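For comparison, the following sketch inspects torchvision's resnet18 (the ImageNet variant, whose stem is a 7×7 stride-2 convolution followed by max pooling rather than the 3×3 stem used in the code below) and counts its weighted layers by kernel size. It assumes a reasonably recent torchvision (for the weights=None argument):

from collections import Counter

import torch.nn as nn
from torchvision import models

tv_model = models.resnet18(weights=None)  # ImageNet-style architecture, randomly initialised

counts = Counter()
for m in tv_model.modules():
    if isinstance(m, nn.Conv2d):
        counts['conv %dx%d' % m.kernel_size] += 1
    elif isinstance(m, nn.Linear):
        counts['linear'] += 1

print(counts)
# Expected: sixteen 3x3 convolutions inside the residual blocks, one 7x7 stem
# convolution, three 1x1 convolutions on the projection shortcuts, and one
# linear classifier -- i.e. 17 conv layers on the main path plus 1 FC layer.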
2. Residual Blocks
Each residual block in ResNet18 contains two 3×3 convolutional layers, each followed by a Batch Normalization layer; a ReLU activation is applied after the first convolution and again after the shortcut addition. When a block changes the spatial resolution (stride 2) or the number of channels, the shortcut branch additionally uses a 1×1 convolutional layer (with Batch Normalization) to match the dimensions before the addition.
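The dimension-matching 1×1 convolution can be seen in isolation in the following sketch: a stride-2, channel-doubling projection that turns a 64-channel 32×32 feature map into the 128-channel 16×16 shape produced by the block's main path (the tensor sizes are illustrative):

import torch
import torch.nn as nn

# Projection shortcut: a 1x1 convolution with stride 2 doubles the channels
# and halves the spatial resolution so it can be added to the block output.
shortcut = nn.Sequential(
    nn.Conv2d(64, 128, kernel_size=1, stride=2, bias=False),
    nn.BatchNorm2d(128),
)

x = torch.randn(1, 64, 32, 32)
print(shortcut(x).shape)  # torch.Size([1, 128, 16, 16])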
3. Fully Connected Layer
After the last residual stage, ResNet18 applies global average pooling, which produces a 512-dimensional feature vector. A single fully connected layer then maps these 512 features to one score per class, and a Softmax function (usually computed inside the cross-entropy loss) turns the scores into class probabilities.
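A minimal sketch of this classification head in isolation (the 10-class output and the 4×4 feature-map size match the CIFAR-style model defined below; they are assumptions, not fixed by ResNet18 itself):

import torch
import torch.nn as nn
import torch.nn.functional as F

features = torch.randn(1, 512, 4, 4)         # output of the last residual stage
pooled = F.adaptive_avg_pool2d(features, 1)  # global average pooling -> (1, 512, 1, 1)
flat = pooled.flatten(1)                     # (1, 512)
logits = nn.Linear(512, 10)(flat)            # one score per class
probs = F.softmax(logits, dim=1)             # class probabilities, summing to 1
print(probs.shape, probs.sum().item())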
III. Code Examples
1. Define the ResNet18 architecture
import torch
import torch.nn as nn
import torch.nn.functional as F


class BasicBlock(nn.Module):
    """Residual block with two 3x3 convolutions and an identity/projection shortcut."""
    expansion = 1

    def __init__(self, in_planes, planes, stride=1):
        super(BasicBlock, self).__init__()
        self.conv1 = nn.Conv2d(in_planes, planes, kernel_size=3,
                               stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(planes)
        self.conv2 = nn.Conv2d(planes, planes, kernel_size=3,
                               stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(planes)

        # Identity shortcut by default; a 1x1 convolution adjusts the
        # dimensions when the stride or channel count changes.
        self.shortcut = nn.Sequential()
        if stride != 1 or in_planes != self.expansion * planes:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_planes, self.expansion * planes,
                          kernel_size=1, stride=stride, bias=False),
                nn.BatchNorm2d(self.expansion * planes)
            )

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out += self.shortcut(x)  # add the shortcut before the final ReLU
        out = F.relu(out)
        return out


class ResNet18(nn.Module):
    """CIFAR-style ResNet18: a 3x3 stem convolution followed by four stages of two BasicBlocks."""

    def __init__(self, num_classes=10):
        super(ResNet18, self).__init__()
        self.in_planes = 64
        self.conv1 = nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(64)
        self.layer1 = self._make_layer(64, 2, stride=1)
        self.layer2 = self._make_layer(128, 2, stride=2)
        self.layer3 = self._make_layer(256, 2, stride=2)
        self.layer4 = self._make_layer(512, 2, stride=2)
        self.linear = nn.Linear(512 * BasicBlock.expansion, num_classes)

    def _make_layer(self, planes, num_blocks, stride):
        # The first block of a stage may downsample; the remaining blocks keep stride 1.
        strides = [stride] + [1] * (num_blocks - 1)
        layers = []
        for stride in strides:
            layers.append(BasicBlock(self.in_planes, planes, stride))
            self.in_planes = planes * BasicBlock.expansion
        return nn.Sequential(*layers)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.layer1(out)
        out = self.layer2(out)
        out = self.layer3(out)
        out = self.layer4(out)
        out = F.avg_pool2d(out, 4)  # global average pooling for 32x32 inputs
        out = out.view(out.size(0), -1)
        out = self.linear(out)
        return out
2. Instantiate the ResNet18 model
resnet18 = ResNet18()
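A quick sanity check of the instantiated model with a random batch of 32×32 RGB images (this input size is assumed because the forward pass ends with avg_pool2d(out, 4), which expects a 4×4 feature map):

import torch

x = torch.randn(2, 3, 32, 32)  # batch of two CIFAR-sized images
with torch.no_grad():
    logits = resnet18(x)
print(logits.shape)            # torch.Size([2, 10])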
3. Model training
import torch
import torch.nn as nn

# Assumes `trainloader` is a DataLoader over the training set (see the sketch below)
# and that a number of epochs has been chosen, e.g.:
num_epochs = 10

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(resnet18.parameters())

for epoch in range(num_epochs):
    running_loss = 0.0
    for i, data in enumerate(trainloader, 0):
        inputs, labels = data
        optimizer.zero_grad()
        outputs = resnet18(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()

        running_loss += loss.item()
        if i % 2000 == 1999:  # print the average loss every 2000 mini-batches
            print('[%d, %5d] loss: %.3f' % (epoch + 1, i + 1, running_loss / 2000))
            running_loss = 0.0
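The loop above assumes a trainloader yielding (inputs, labels) batches. A minimal sketch of how such a loader could be built for CIFAR-10 with torchvision (the dataset choice, batch size and normalization constants are assumptions, not specified in the original article):

import torch
import torchvision
import torchvision.transforms as transforms

transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),  # scale pixels to roughly [-1, 1]
])

trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                        download=True, transform=transform)
# batch_size=4 keeps the 2000-mini-batch logging interval in the loop above meaningful
trainloader = torch.utils.data.DataLoader(trainset, batch_size=4,
                                          shuffle=True, num_workers=2)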