
[Winter Blog Challenge] How to Use TensorBoard

egahyun 2025. 6. 12. 19:26

Hello ๐Ÿ˜Š I'm GDGoC member ์ด๊ฐ€ํ˜„.

 

I'd like to introduce TensorBoard, a tool that lets you monitor the training process of a deep-learning project in an intuitive way.

I'll first introduce TensorBoard itself, then walk through an example of using it with deep learning and an example of using it with machine learning!


What is TensorBoard?

TensorBoard is a training visualization tool provided by TensorFlow. It lets you watch the various logs produced during training, such as loss, accuracy, and gradients, as graphs in real time.

 

By checking the training metrics as graphs, you can verify that training is actually progressing as expected, and after tuning hyperparameters you can compare against earlier runs to find effective combinations of settings.

 

๋˜, ๋ชจ๋ธ ๊ตฌ์กฐ ๋ฐ ๋‚ด๋ถ€ ์ •๋ณด๋ฅผ ๋ถ„์„ํ•  ์ˆ˜ ์žˆ์–ด, ํŒŒ๋ผ๋ฏธํ„ฐ์˜ ๋ถ„ํฌ์™€ ๋ณ€ํ™”, gradient ๋ณ€ํ™”๋ฅผ ์ถ”์ ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

๋˜ํ•œ ์—ฐ์‚ฐ ๊ทธ๋ž˜ํ”„๋ฅผ ์‹œ๊ฐํ™” ํ•จ์œผ๋กœ์จ ๋ชจ๋ธ ๊ตฌ์กฐ๋ฅผ ํŒŒ์•…ํ•  ์ˆ˜๋„ ์žˆ์Šต๋‹ˆ๋‹ค.
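As a rough sketch of how that might look with torch.utils.tensorboard (assuming the model, writer, and a sample batch from the practice below; the tag names are placeholders): add_histogram records the distribution of weights or gradients each epoch, and add_graph records the computation graph from one example input.

# Inside the training loop, once per epoch:
for name, param in model.named_parameters():
    writer.add_histogram(f"weights/{name}", param, epoch)          # parameter distribution
    if param.grad is not None:
        writer.add_histogram(f"grads/{name}", param.grad, epoch)   # gradient distribution

# Once, with any example batch, to visualize the computation graph (GRAPHS tab)
writer.add_graph(model, X_batch)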

 

In particular, lots of models built on high-dimensional data such as images, audio, and text are being developed these days. TensorBoard can also visualize the relationships between such high-dimensional data points through embeddings, and you can record the prediction results made with them.

 


Hands-on: Using TensorBoard with PyTorch - Deep Learning

I simply loaded the Titanic dataset, defined a small model, and ran the practice with it.
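The preprocessing itself isn't the focus of this post, so it isn't shown in detail; for reference, one way the df below could be prepared is sketched here (the file path and the choice of six features are assumptions and only need to match the six inputs the model expects).

import pandas as pd

# Assumed preprocessing: keep six numeric features plus the label, encode Sex, handle missing values.
df = pd.read_csv("titanic.csv")  # hypothetical path to the Kaggle Titanic training file
df = df[["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare", "Survived"]].copy()
df["Sex"] = df["Sex"].map({"male": 0, "female": 1})
df["Age"] = df["Age"].fillna(df["Age"].median())
df = df.dropna()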

 

๋ฐ์ดํ„ฐ ์…‹

import torch
from torch.utils.data import Dataset, DataLoader
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# df: preprocessed Titanic DataFrame with six numeric feature columns and the "Survived" label
X = df.drop("Survived", axis=1).values
y = df["Survived"].values
X = StandardScaler().fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Wrap the arrays in a Dataset so a DataLoader can serve mini-batches
class TitanicDataset(Dataset):
    def __init__(self, X, y):
        self.X = torch.tensor(X, dtype=torch.float32)
        self.y = torch.tensor(y, dtype=torch.long)
    def __len__(self): return len(self.X)
    def __getitem__(self, idx): return self.X[idx], self.y[idx]

train_loader = DataLoader(TitanicDataset(X_train, y_train), batch_size=32, shuffle=True)
test_loader = DataLoader(TitanicDataset(X_test, y_test), batch_size=32)

 

 

๋ชจ๋ธ ์ •์˜

import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

class TitanicNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, 16),   # 6 input features -> 16 hidden units
            nn.ReLU(),
            nn.Linear(16, 2)    # 2 output classes: did not survive / survived
        )
    def forward(self, x): return self.net(x)

model = TitanicNet().to(device)

 

 

SummaryWriter setup

: Just pass the directory path where the logs should be stored to SummaryWriter, and the writer is ready.

from torch.utils.tensorboard import SummaryWriter
writer = SummaryWriter("runs/nn_titanic")

 

 

Logging

: Add writer.add_scalar() inside the training code so that the logs you want are recorded.

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

for epoch in range(100):
    model.train()
    total_loss = 0
    correct = 0

    for X_batch, y_batch in train_loader:
        X_batch, y_batch = X_batch.to(device), y_batch.to(device)
        output = model(X_batch)
        loss = criterion(output, y_batch)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        total_loss += loss.item()
        pred = output.argmax(1)
        correct += (pred == y_batch).sum().item()

    acc = correct / len(train_loader.dataset)
    
    # record logs with the writer
    writer.add_scalar("Loss/train", total_loss / len(train_loader), epoch)
    writer.add_scalar("Accuracy/train", acc, epoch)

 

 

Writer close

: To stop writing logs, run the code below once training has finished.

writer.close()

 

Running TensorBoard

: ๋กœ๊ทธ๋ฅผ ํ™•์ธํ•  ์ˆ˜ ์žˆ๋„๋ก ์•„๋ž˜์˜ ์ฝ”๋“œ๋ฅผ ์‹คํ–‰ํ•˜๋ฉด ๋ฉ๋‹ˆ๋‹ค!

%load_ext tensorboard
%tensorboard --logdir=runs
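If you're not working in a notebook, you can also start it from a terminal and open the printed address (http://localhost:6006 by default) in a browser:

tensorboard --logdir=runs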

 

When you run it, you can check the results like this!


Hands-on: Using TensorBoard with LightGBM - Machine Learning

For this practice I again loaded the Titanic dataset and used LightGBM, a machine-learning model, to make predictions.

๋จธ์‹ ๋Ÿฌ๋‹ ๋ชจ๋ธ์—์„œ๋Š” ํ•™์Šต ๋กœ๊ทธ ๊ธฐ๋ก์ด ์ €์ ˆ๋กœ ๋„˜์–ด๊ฐ€์ง„ ์•Š์ง€๋งŒ ๋กœ๊ทธ ๊ธฐ๋ก์„ ๊ตฌํ˜„ํ•˜์—ฌ ๋™์ผํ•˜๊ฒŒ ์‹œ๊ฐํ™”ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค!

 

๋ฐ์ดํ„ฐ ์…‹

import lightgbm as lgb

# X, y: the same scaled features and "Survived" labels prepared above
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

train_data = lgb.Dataset(X_train, label=y_train)
valid_data = lgb.Dataset(X_test, label=y_test)

 

 

SummaryWriter setup

: Just pass the directory path where the logs should be stored to SummaryWriter, and the writer is ready.

# Use a separate log directory so this run doesn't get mixed with the PyTorch run above
writer = SummaryWriter("runs/lgbm_titanic")

 

 

๋กœ๊ทธ ๊ธฐ๋ก์„ ์œ„ํ•œ ์ฝœ๋ฐฑ ํ•จ์ˆ˜ ์ •์˜

def tb_callback(env):
    epoch = env.iteration
    # each entry of evaluation_result_list is (dataset_name, metric_name, value, is_higher_better)
    train_loss = env.evaluation_result_list[0][2]   # "train" binary_logloss
    val_loss = env.evaluation_result_list[1][2]     # "valid" binary_logloss
    writer.add_scalar("Loss/train", train_loss, epoch)
    writer.add_scalar("Loss/valid", val_loss, epoch)

 

 

๋ชจ๋ธ ์ •์˜

: ๋ชจ๋ธ train ์ฝ”๋“œ ์•ˆ์— ์ฝœ๋ฐฑ ํ•จ์ˆ˜๋ฅผ ๋„ฃ์–ด, ๋กœ๊ทธ ๊ธฐ๋ก์„ ๊ฐ€๋Šฅํ† ๋ก ํ•ฉ๋‹ˆ๋‹ค.

## ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ
params = {
    "objective": "binary",
    "metric": "binary_logloss",
    "learning_rate": 0.1,
    "verbosity": -1
}

model = lgb.train(
    params,
    train_data,
    num_boost_round=100,
    valid_sets=[train_data, valid_data],
    valid_names=["train", "valid"],
    ## Callbacks
    callbacks=[
        tb_callback,
        lgb.log_evaluation(period=1)
    ]
)

 

 

Logging prediction results

: If you want to record the logs for the prediction results, log them right after model.predict, and then you can check the test results as well!

from sklearn.metrics import accuracy_score

y_pred = model.predict(X_test)                              # predicted survival probabilities
acc = accuracy_score(y_test, (y_pred >= 0.5).astype(int))   # threshold at 0.5 to get class labels
writer.add_scalar("Accuracy/test", acc, 0)

 

 

Writer close

: To stop writing logs, run the code below once training has finished.

writer.close()

 

 

Running TensorBoard

: ๋กœ๊ทธ๋ฅผ ํ™•์ธํ•  ์ˆ˜ ์žˆ๋„๋ก ์•„๋ž˜์˜ ์ฝ”๋“œ๋ฅผ ์‹คํ–‰ํ•˜๋ฉด ๋ฉ๋‹ˆ๋‹ค!

%load_ext tensorboard
%tensorboard --logdir=runs

 

When you run it, you can check the results like this!


Used well, TensorBoard is a great tool that helps you improve model performance quickly and keep your experiments moving in the right direction as you iterate!

๋จธ์‹ ๋Ÿฌ๋‹ ๋ชจ๋ธ์„ ๊ตฌํ˜„ํ•˜๊ฑฐ๋‚˜ ๋”ฅ๋Ÿฌ๋‹ ๋ชจ๋ธ์„ ๊ตฌํ˜„ํ•  ๋•Œ, ํ™œ์šฉํ•˜์—ฌ ์ข‹์€ ๊ฒฐ๊ณผ๋ฅผ ๋งŒ๋“ค์–ด๋ณด์„ธ์š”๐Ÿ˜Ž๐ŸŽถ

๊ฐ์‚ฌํ•ฉ๋‹ˆ๋‹ค๐Ÿ™Œ