num_epochs, lr = 5, 0.1

# 本函数已保存在d2lzh包中方便以后使用
# (This function is saved in the d2lzh package for later use)
def train_ch3(net, train_iter, test_iter, loss, num_epochs, batch_size,
              params=None, lr=None, trainer=None):
    for epoch in range(num_epochs):
        train_l_sum, train_acc_sum, n = 0.0, 0.0, 0
        for X, y in train_iter:
            with autograd.record():
                y_hat = net(X)
                l = loss(y_hat, y).sum()
            l.backward()
            if trainer is None:
                d2l.sgd(params, lr, batch_size)
            else:
                # “softmax回归的简洁实现”一节将用到
                # (will be used in the "concise implementation of
                # softmax regression" section)
                trainer.step(batch_size)
            y = y.astype('float32')
            train_l_sum += l.asscalar()
            train_acc_sum += (y_hat.argmax(axis=1) == y).sum().asscalar()
            n += y.size
        test_acc = evaluate_accuracy(test_iter, net)
        print('epoch %d, loss %.4f, train acc %.3f, test acc %.3f'
              % (epoch + 1, train_l_sum / n, train_acc_sum / n, test_acc))

train_ch3(net, train_iter, test_iter, cross_entropy, num_epochs, batch_size,
          [W, b], lr)
The chunk above defines a function in Python. It contains two comments in Chinese: # 本函数已保存在d2lzh包中方便以后使用 ("this function is saved in the d2lzh package for later use") and # “softmax回归的简洁实现”一节将用到 ("will be used in the 'concise implementation of softmax regression' section").
When I run it, I get this error:
Warning in strsplit(code, "\n", fixed = TRUE) :
input string 1 is invalid UTF-8
NameError: name 'NA' is not defined
Detailed traceback:
File "<string>", line 1, in <module>
But when I remove the Chinese characters, it works.
How can I fix this encoding problem?
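For context, the comment text itself round-trips cleanly under UTF-8 in Python, so the "invalid UTF-8" warning would only arise if the script file's bytes are stored in some other encoding. A minimal sketch of that difference (GBK here is only an assumed example of a non-UTF-8 encoding; I have not confirmed it is the one involved):

```python
# The Chinese comment from the function above
comment = "# 本函数已保存在d2lzh包中方便以后使用"

# Round-trips cleanly when the source file is saved as UTF-8
utf8_bytes = comment.encode("utf-8")
print(utf8_bytes.decode("utf-8") == comment)  # True

# The same text saved as GBK bytes is not valid UTF-8
gbk_bytes = comment.encode("gbk")
try:
    gbk_bytes.decode("utf-8")
    print("decoded as UTF-8")
except UnicodeDecodeError:
    print("not valid UTF-8")
```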