[Winter Blog Challenge] How to Use TensorBoard
Hello 👋 I'm a GDGoC member.
In this post I'd like to introduce TensorBoard, a tool that lets you intuitively monitor the training process in deep learning projects.
I'll first introduce TensorBoard itself, then walk through an example of using it in deep learning and an example of using it in machine learning!
What is TensorBoard?
TensorBoard is a training-process visualization tool provided by TensorFlow. It displays the various logs produced during training, such as loss, accuracy, and gradients, as graphs you can watch in real time.
By checking the training metrics as graphs, you can verify that training is actually progressing as expected,
and when tuning hyperparameters, comparing against previous results helps you search for effective combinations of settings.
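As a side note on run-to-run comparison, torch.utils.tensorboard provides add_hparams(), which records a set of hyperparameters together with the resulting metrics so runs can be compared side by side in the HPARAMS tab. A minimal sketch, assuming made-up learning rates and a placeholder metric value:

from torch.utils.tensorboard import SummaryWriter

# compare two hypothetical learning-rate runs in the HPARAMS tab
for lr in [0.001, 0.01]:
    with SummaryWriter(f"runs/hparams_lr_{lr}") as w:
        final_acc = 0.0  # placeholder; in practice, the accuracy this lr achieved
        w.add_hparams({"lr": lr, "batch_size": 32}, {"hparam/accuracy": final_acc})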
It also lets you analyze the model's structure and internals, so you can track the distribution and evolution of parameters and of gradients.
You can also visualize the computation graph to understand the model architecture.
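To make those last two capabilities concrete, here is a minimal sketch of logging parameter histograms with add_histogram() and the computation graph with add_graph(); the stand-in model and log directory are my own assumptions:

import torch
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter("runs/inspect_demo")  # hypothetical log directory
model = torch.nn.Linear(6, 2)                # stand-in model for illustration

# HISTOGRAMS tab: log the distribution of every parameter tensor at a step
for name, param in model.named_parameters():
    writer.add_histogram(name, param, global_step=0)

# GRAPHS tab: trace the model once with a dummy input to record its structure
writer.add_graph(model, torch.randn(1, 6))
writer.close()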
In particular, models that work with high-dimensional data such as images, audio, and text are being developed in great variety these days.
TensorBoard can visualize the relationships between high-dimensional data points through its embedding projector,
and you can also log prediction results based on them.
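A minimal sketch of that projector workflow with add_embedding(), using random features and placeholder labels purely for illustration:

import torch
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter("runs/embedding_demo")  # hypothetical log directory
features = torch.randn(100, 64)                # 100 points in a 64-dim space
labels = [str(i % 2) for i in range(100)]      # placeholder class labels

# PROJECTOR tab: explore the points with PCA / t-SNE projections
writer.add_embedding(features, metadata=labels)
writer.close()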


Hands-on with PyTorch and TensorBoard - Deep Learning
For this exercise, I simply loaded the Titanic dataset and defined a small model.
Dataset
import torch
from torch.utils.data import Dataset, DataLoader
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# df is assumed to be a preprocessed Titanic DataFrame with six numeric
# feature columns plus the "Survived" target
X = df.drop("Survived", axis=1).values
y = df["Survived"].values
X = StandardScaler().fit_transform(X)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

class TitanicDataset(Dataset):
    def __init__(self, X, y):
        self.X = torch.tensor(X, dtype=torch.float32)
        self.y = torch.tensor(y, dtype=torch.long)

    def __len__(self): return len(self.X)
    def __getitem__(self, idx): return self.X[idx], self.y[idx]

train_loader = DataLoader(TitanicDataset(X_train, y_train), batch_size=32, shuffle=True)
test_loader = DataLoader(TitanicDataset(X_test, y_test), batch_size=32)
Model definition
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

class TitanicNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, 16),   # 6 input features -> 16 hidden units
            nn.ReLU(),
            nn.Linear(16, 2)    # 2 classes: survived / not survived
        )

    def forward(self, x):
        return self.net(x)

model = TitanicNet().to(device)
SummaryWriter setup
: Specify the directory where the logs should be saved and create a writer pointing at it.
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter("runs/nn_titanic")
Logging
: Add writer.add_scalar() inside the training loop to record whichever metrics you want.
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

for epoch in range(100):
    model.train()
    total_loss = 0
    correct = 0
    for X_batch, y_batch in train_loader:
        X_batch, y_batch = X_batch.to(device), y_batch.to(device)
        output = model(X_batch)
        loss = criterion(output, y_batch)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        total_loss += loss.item()
        pred = output.argmax(1)
        correct += (pred == y_batch).sum().item()
    acc = correct / len(train_loader.dataset)
    # log the averaged loss and accuracy for this epoch with the writer
    writer.add_scalar("Loss/train", total_loss / len(train_loader), epoch)
    writer.add_scalar("Accuracy/train", acc, epoch)
Writer close
: To finish writing the logs, run the code below once training is done.
writer.close()
Running TensorBoard
: Run the code below to view the logs!
%load_ext tensorboard
%tensorboard --logdir=runs
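If you are working outside a notebook, the same dashboard can be opened by running tensorboard --logdir runs in a terminal and visiting http://localhost:6006, the default port.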
When you run it, you can check the graphs in the dashboard like this!

Hands-on with LGBM and TensorBoard - Machine Learning
In this exercise, I loaded the Titanic dataset and used LightGBM, a machine learning model, to make predictions.
Machine learning models do not write TensorBoard training logs on their own, but you can implement the logging yourself and get exactly the same kind of visualization!
Dataset
import lightgbm as lgb

# X and y are the same preprocessed Titanic features and labels used above
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
train_data = lgb.Dataset(X_train, label=y_train)
valid_data = lgb.Dataset(X_test, label=y_test)
SummaryWriter setup
: As before, specify the directory where the logs should be saved and create a writer pointing at it.
# a separate run directory keeps the LGBM curves from mixing with the NN run
writer = SummaryWriter("runs/lgbm_titanic")
Defining a callback function for logging
def tb_callback(env):
    epoch = env.iteration
    train_loss = env.evaluation_result_list[0][2]
    val_loss = env.evaluation_result_list[1][2]
    writer.add_scalar("Loss/train", train_loss, epoch)
    writer.add_scalar("Loss/valid", val_loss, epoch)
Model training
: Pass the callback function into the training call so that the logs get recorded.
## model parameters
params = {
    "objective": "binary",
    "metric": "binary_logloss",
    "learning_rate": 0.1,
    "verbosity": -1
}
model = lgb.train(
    params,
    train_data,
    num_boost_round=100,
    valid_sets=[train_data, valid_data],
    valid_names=["train", "valid"],
    ## callbacks
    callbacks=[
        tb_callback,
        lgb.log_evaluation(period=1)
    ]
)
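Note that lgb.log_evaluation(period=1) only prints the evaluation results to the console every round; it is tb_callback that actually writes them to TensorBoard.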
Logging prediction results
: To record metrics for the predictions, call model.predict and then log the result; that way you can check the test results too!
from sklearn.metrics import accuracy_score

y_pred = model.predict(X_test)
acc = accuracy_score(y_test, (y_pred >= 0.5).astype(int))
writer.add_scalar("Accuracy/test", acc, 0)
Writer close
: To finish writing the logs, run the code below once training is done.
writer.close()
Running TensorBoard
: Run the code below to view the logs!
%load_ext tensorboard
%tensorboard --logdir=runs
When you run it, you can check the graphs in the dashboard like this!

Used well, TensorBoard is a great tool that helps you improve model performance quickly
and move in the right direction as you iterate on experiments!
Whether you are building machine learning or deep learning models, try using it to get great results 👏
Thank you 🙂