Output eval logging (batch level) #2977

Merged
Changes from 245 commits
254 commits
ec6fc17
prelim commit
bmosaicml Sep 12, 2023
a59b644
fix max answer lengths for cot
bmosaicml Sep 12, 2023
97b1218
add output logger
bmosaicml Sep 12, 2023
7174e75
create eval output logger
bmosaicml Sep 12, 2023
fdbd53b
fix pyright; git push
bmosaicml Sep 12, 2023
909d07b
Merge branch 'dev' into error_logging_callback
bmosaicml Sep 13, 2023
9f4e3d2
change dist reduce fx
bmosaicml Sep 13, 2023
dce297c
Merge branch 'error_logging_callback' of github.com:bmosaicml/compose…
bmosaicml Sep 13, 2023
ea4e7ee
change dist reduce fx
bmosaicml Sep 13, 2023
5630c23
fix pyright
bmosaicml Sep 13, 2023
30623f7
Merge branch 'dev' into error_logging_callback
bmosaicml Sep 13, 2023
e161e33
Add nightly docker image (#2452)
j316chuck Aug 23, 2023
743fbe1
Fix local eval (#2465)
rishab-partha Aug 24, 2023
0c333b6
Add torch 2.1.0 args for github release-docker workflow
j316chuck Aug 24, 2023
da4e19f
Log system metrics on each event (#2412)
prithvikannan Aug 24, 2023
60d3dc6
Fix torch 2.1.0 docker tag (#2472)
j316chuck Aug 24, 2023
15385b2
Upstream Generate Callback (#2449)
irenedea Aug 25, 2023
ec59026
Upgrade torch nightly docker image for 0.18.3 NCCL version (#2476)
j316chuck Aug 25, 2023
a5ec1ac
Test pytorch 2.1.0 docker images on ci/cd (#2469)
j316chuck Aug 25, 2023
145aeb8
Fix huggingface tokenizer loading for slow tokenizers (#2483)
dakinggg Aug 28, 2023
816a61b
Deprecate Fused LayerNorm (#2475)
nik-mosaic Aug 28, 2023
de68763
Transformers upgrade (#2489)
dakinggg Aug 29, 2023
c4488b5
Update RTD build config with build.os (#2490)
bandish-shah Aug 29, 2023
d91fe4d
Upgrade torch docker version and github workflow tests (#2488)
j316chuck Aug 29, 2023
3a9706d
upgrade node version (#2492)
j316chuck Aug 29, 2023
ee67e99
Gating tying modules w/ FSDP for torch 2.0 (#2467)
bcui19 Aug 30, 2023
99b98ef
Removing min_params (#2494)
bcui19 Aug 30, 2023
91d961d
Fix torchmetrics backwards compatibility issue (#2468)
eracah Aug 31, 2023
6add304
Adding some fixes to FSDP tests (#2495)
bcui19 Aug 31, 2023
b5e0950
fail count (#2496)
mvpatel2000 Aug 31, 2023
8e106a6
Remove PR curve metrics from backward compatibility test and skip tor…
eracah Aug 31, 2023
fc6c995
filter warning (#2500)
mvpatel2000 Sep 1, 2023
ac60704
bump version (#2498)
mvpatel2000 Sep 1, 2023
9274a77
Skip metrics in state dict (#2501)
mvpatel2000 Sep 1, 2023
d7b49c7
Add peak memory stats (#2504)
mvpatel2000 Sep 1, 2023
c24d60d
fix sharded ckpt (#2505)
mvpatel2000 Sep 1, 2023
4e50192
Bump gitpython from 3.1.31 to 3.1.34 (#2509)
dependabot[bot] Sep 5, 2023
90e8bf2
Annotate `torch_prof_remote_file_name` as Optional (#2512)
srstevenson Sep 5, 2023
dac9054
fix: when there is no train_metrics, do not checkpoint (#2502)
furkanbiten Sep 5, 2023
8dfa2db
Remove metric saving (#2514)
mvpatel2000 Sep 7, 2023
c8f3ecd
Fix daily tests by removing gpu marker (#2515)
j316chuck Sep 7, 2023
284c1b7
Refactor mosaic_fsdp.py (#2506)
b-chu Sep 7, 2023
c507e30
fix pr (#2517)
mvpatel2000 Sep 7, 2023
303b7c3
Add custom sharding to ChunkShardingSpec (#2507)
b-chu Sep 8, 2023
3a19321
Update nightly docker image to torch nightly 09-03-23 (#2518)
j316chuck Sep 8, 2023
4ca8f5a
Update pre-commit in setup.py (#2522)
b-chu Sep 8, 2023
c1f87f7
Add FSDP custom wrap with torch 2.1 (#2460)
mvpatel2000 Sep 8, 2023
decf2b2
Fix GCSObjectStore bug where hmac keys auth doesn't work (#2519)
eracah Sep 9, 2023
b521207
prelim commit
bmosaicml Sep 12, 2023
3b09be7
add output logger
bmosaicml Sep 12, 2023
5697e1f
create eval output logger
bmosaicml Sep 12, 2023
2e01b89
change dist reduce fx
bmosaicml Sep 13, 2023
b3b1377
Bump gitpython from 3.1.34 to 3.1.35 (#2525)
dependabot[bot] Sep 12, 2023
1f5012b
Bump pytest from 7.4.0 to 7.4.2 (#2523)
dependabot[bot] Sep 12, 2023
6bdc53e
Upgrade to mlflow version 2.5.0 (#2528)
ngcgarcia Sep 12, 2023
1818b51
disable cifar daily (#2527)
mvpatel2000 Sep 12, 2023
13d411e
mosaicml logger robustness improvements (#2530)
mvpatel2000 Sep 12, 2023
51650ff
Fix metrics keys sort in DecoupledAdamW for OptimizerMonitor FSDP met…
m1kol Sep 12, 2023
c780740
Fix github actions for GCS integration testing (#2532)
mvpatel2000 Sep 13, 2023
17953f4
change dist reduce fx
bmosaicml Sep 13, 2023
cb0ce0e
fix pyright
bmosaicml Sep 13, 2023
f2dd81f
Fix GCS tests (#2535)
mvpatel2000 Sep 13, 2023
8fea658
merge
bmosaicml Sep 14, 2023
75260fc
finish error logging cb
bmosaicml Sep 14, 2023
8bb395f
Merge branch 'dev' into error_logging_callback
bmosaicml Sep 14, 2023
fada3b5
fix
bmosaicml Sep 14, 2023
0540383
Merge branch 'dev' into error_logging_callback
bmosaicml Sep 18, 2023
0e6e6d8
add import to init
bmosaicml Sep 18, 2023
3668c29
Merge branch 'error_logging_callback' of github.com:bmosaicml/compose…
bmosaicml Sep 18, 2023
c653090
add import to init
bmosaicml Sep 18, 2023
b8be3a2
add import to init
bmosaicml Sep 18, 2023
f309785
add file writing
bmosaicml Sep 18, 2023
33b35af
add file writing
bmosaicml Sep 18, 2023
1a0ef89
add file writing
bmosaicml Sep 18, 2023
8aa77f0
add file writing
bmosaicml Sep 18, 2023
1b7e6db
add file writing
bmosaicml Sep 18, 2023
9c75b53
move tensors to cpu
bmosaicml Sep 19, 2023
7a41a01
remove tensors
bmosaicml Sep 19, 2023
e5c8b61
remove tensors
bmosaicml Sep 19, 2023
a33cbd9
remove tensors
bmosaicml Sep 20, 2023
fa88a05
add prompt to qa
bmosaicml Sep 20, 2023
8111682
add prompt to qa
bmosaicml Sep 20, 2023
6e651fd
add prompt to qa
bmosaicml Sep 20, 2023
501bc0c
add prompt to qa
bmosaicml Sep 20, 2023
0116903
add prompt to qa
bmosaicml Sep 20, 2023
5fa5957
add prompt to qa
bmosaicml Sep 20, 2023
5ffb804
add prompt to qa
bmosaicml Sep 20, 2023
605f437
add prompt to qa
bmosaicml Sep 20, 2023
afaa437
add prompt to qa
bmosaicml Sep 20, 2023
b772029
add prompt to qa
bmosaicml Sep 20, 2023
6f8e0d7
add prompt to qa
bmosaicml Sep 20, 2023
92779c4
add prompt to qa
bmosaicml Sep 21, 2023
1ec300e
add prompt to qa
bmosaicml Sep 21, 2023
cf943f4
add prompt to qa
bmosaicml Sep 22, 2023
be859bb
try debugging dist sync issue
bmosaicml Sep 25, 2023
8ab2b04
nit
bmosaicml Sep 25, 2023
a6999aa
debugging
bmosaicml Sep 25, 2023
828ceec
debugging
bmosaicml Sep 25, 2023
b98ffe6
debugging
bmosaicml Sep 25, 2023
0c61063
debugging
jcd2020 Sep 26, 2023
72a4f2b
debugging
jcd2020 Sep 26, 2023
8cf8829
debugging
jcd2020 Sep 26, 2023
484510c
debugging
jcd2020 Sep 26, 2023
07cbebf
debugging
jcd2020 Sep 26, 2023
e6af285
debugging
jcd2020 Sep 26, 2023
6f55ff5
debugging
jcd2020 Sep 26, 2023
e0d80ab
debugging
jcd2020 Sep 26, 2023
177b935
debugging
jcd2020 Sep 26, 2023
dcfa6de
debugging
jcd2020 Sep 26, 2023
cce1fb0
debugging
jcd2020 Sep 26, 2023
10ab1ca
debugging
jcd2020 Sep 26, 2023
3b3fd26
fix syncing of non tensor state
jcd2020 Sep 26, 2023
ab6d797
Merge branch 'dev' into error_logging_callback
bmosaicml Nov 8, 2023
6266eeb
added gpu test
bmosaicml Nov 8, 2023
cd1fc58
merge
bmosaicml Nov 8, 2023
76882cb
fix error
bmosaicml Nov 9, 2023
2855e1f
finish testing callback
bmosaicml Nov 15, 2023
29a5803
fix all errors
bmosaicml Nov 16, 2023
f56c9de
Merge branch 'dev' into error_logging_callback
bmosaicml Nov 16, 2023
57133a4
test commit
tbarton16 Nov 21, 2023
7196028
roll back test commit
tbarton16 Nov 21, 2023
4410203
Merge branch 'dev' into error_logging_callback
bmosaicml Nov 27, 2023
21e322e
remove ranks
bmosaicml Nov 27, 2023
e4eb7ee
Merge branch 'error_logging_callback' of github.com:bmosaicml/compose…
bmosaicml Nov 27, 2023
61447a2
Merge branch 'error_logging_callback' of github.com:bmosaicml/compose…
bmosaicml Nov 27, 2023
d69bdba
re-tesT
bmosaicml Nov 27, 2023
d999b68
Merge branch 'error_logging_callback' of github.com:bmosaicml/compose…
bmosaicml Nov 27, 2023
c030717
Merge branch 'dev' into error_logging_callback
bmosaicml Dec 5, 2023
ca4d3c4
add custome gen kwargs and stopping on eos token
bmosaicml Dec 14, 2023
1e39623
modify test
bmosaicml Dec 14, 2023
09af753
modify test
bmosaicml Dec 14, 2023
47f3c91
Merge branch 'dev' into pass_on_custom_generation_kwargs
bmosaicml Dec 15, 2023
9f9a6bc
Merge branch 'pass_on_custom_generation_kwargs' into error_logging_ca…
bmosaicml Dec 15, 2023
a3501e9
finish
bmosaicml Dec 18, 2023
fadce0e
finish
bmosaicml Dec 18, 2023
92157da
finish
bmosaicml Dec 18, 2023
d137bbc
finish
bmosaicml Dec 18, 2023
b25da3f
Merge branch 'dev' into pass_on_custom_generation_kwargs
bmosaicml Dec 18, 2023
909ed63
finish pr
bmosaicml Dec 20, 2023
e263b5b
Merge branch 'pass_on_custom_generation_kwargs' of github.com:bmosaic…
bmosaicml Dec 20, 2023
32d6668
Merge branch 'dev' into pass_on_custom_generation_kwargs
bmosaicml Dec 20, 2023
4ff16b4
implement early stop
bmosaicml Dec 20, 2023
7ee0a72
Merge branch 'pass_on_custom_generation_kwargs' into add_custom_stopp…
bmosaicml Dec 20, 2023
bcf002e
Merge branch 'add_custom_stopping_criteria' into error_logging_callback
bmosaicml Dec 20, 2023
83a60b7
add tesT
bmosaicml Dec 20, 2023
a031772
Merge branch 'pass_on_custom_generation_kwargs' into add_custom_stopp…
bmosaicml Dec 20, 2023
e5943d6
merge update
bmosaicml Dec 20, 2023
67b4685
Merge branch 'add_custom_stopping_criteria' of github.com:bmosaicml/c…
bmosaicml Dec 20, 2023
be32781
Merge branch 'add_custom_stopping_criteria' into error_logging_callback
bmosaicml Dec 20, 2023
e512a21
merge
bmosaicml Dec 20, 2023
a1af91a
fix
bmosaicml Dec 22, 2023
5f23b3e
finish
bmosaicml Dec 23, 2023
42fb431
finish
bmosaicml Dec 23, 2023
aa05076
fix bug
bmosaicml Dec 23, 2023
076731d
Merge branch 'add_custom_stopping_criteria' into error_logging_callback
bmosaicml Dec 23, 2023
89669c6
finish
bmosaicml Dec 23, 2023
95a7d28
Merge branch 'error_logging_callback' of github.com:bmosaicml/compose…
bmosaicml Dec 23, 2023
dce4ef0
bug fix
bmosaicml Dec 23, 2023
cb3c69d
add keys
bmosaicml Dec 26, 2023
cea85d4
Merge branch 'add_custom_stopping_criteria' into error_logging_callback
bmosaicml Dec 26, 2023
7371e66
add correcT
bmosaicml Dec 26, 2023
c7f5198
modify sync
bmosaicml Dec 26, 2023
786c64c
diff split
bmosaicml Dec 26, 2023
559beee
Merge branch 'add_custom_stopping_criteria' into error_logging_callback
bmosaicml Dec 26, 2023
7f20954
fix typo
bmosaicml Dec 26, 2023
bd10cdd
Merge branch 'add_custom_stopping_criteria' into error_logging_callback
bmosaicml Dec 26, 2023
adf5bab
edit condition
bmosaicml Dec 26, 2023
3cc2442
broken wip
maxisawesome Jan 31, 2024
5c71eab
design demonstration commit
maxisawesome Jan 31, 2024
dd774fb
simplify pr
maxisawesome Feb 1, 2024
489e9c1
further simplify
maxisawesome Feb 1, 2024
af02fc6
wip
maxisawesome Feb 5, 2024
38aedb7
add comments
maxisawesome Feb 5, 2024
9c9fe9b
add other icl metrics
maxisawesome Feb 7, 2024
88f063d
wip
maxisawesome Feb 8, 2024
4e172fc
change dict method, add more stuff to logging
maxisawesome Feb 8, 2024
328627e
fix typos, change some comments
maxisawesome Feb 8, 2024
8d89708
Merge branch 'mosaicml_dev' into error_logging_callback_in_batch
maxisawesome Feb 8, 2024
33176f3
decode tensors, fix wrong dict key
maxisawesome Feb 8, 2024
fbc75ca
fix mc
maxisawesome Feb 8, 2024
7dffa47
1 to 0 lol
maxisawesome Feb 8, 2024
b38799a
wip linting
maxisawesome Feb 9, 2024
4dbccb6
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Feb 9, 2024
637247b
adjust to step logging
maxisawesome Feb 9, 2024
d97cd23
adjust logging names
maxisawesome Feb 12, 2024
12669d5
add mflow, rm batch keys
maxisawesome Feb 12, 2024
0230827
add comments, check for dict in huggingface model update_metric
maxisawesome Feb 15, 2024
13cf569
add user specified logging
maxisawesome Feb 15, 2024
4d79393
move metric_name duplication to update_metric
maxisawesome Feb 15, 2024
642663f
wip fix testing
maxisawesome Feb 16, 2024
0f8909f
fix input shape error
maxisawesome Feb 16, 2024
a1bc29d
rm init
maxisawesome Feb 16, 2024
7e1df22
rm eval_after_all
maxisawesome Feb 16, 2024
ed1fa1c
step=None
maxisawesome Feb 22, 2024
677e686
step=state.timestamp.batch.value
maxisawesome Feb 22, 2024
d8352c3
update name to include step
maxisawesome Feb 22, 2024
25cd65d
merge with dev
maxisawesome Feb 22, 2024
c08e2eb
Merge branch 'mosaicml_dev' into error_logging_callback_in_batch
maxisawesome Feb 22, 2024
8928121
linting, wip on test
maxisawesome Feb 22, 2024
a165e92
fix test
maxisawesome Feb 23, 2024
c2b71a4
pyright wip
maxisawesome Feb 23, 2024
3661ae1
Merge branch 'mosaicml_dev' into error_logging_callback_in_batch
maxisawesome Feb 23, 2024
3bccd00
add non-batch warning
maxisawesome Feb 23, 2024
9e23105
pyright
maxisawesome Feb 23, 2024
5328038
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Feb 23, 2024
d95dba5
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Feb 26, 2024
f1a8d41
debug
maxisawesome Feb 26, 2024
a7708fb
rm this commit that wasn't the right branch
maxisawesome Feb 26, 2024
6b9e720
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Feb 26, 2024
25f1872
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Feb 27, 2024
1f1700b
log at the end of training
maxisawesome Feb 28, 2024
eabee96
Merge branch 'error_logging_callback_in_batch' of github.com:maxisawe…
maxisawesome Feb 28, 2024
4ab7d95
rm silly wandb table logging
maxisawesome Feb 28, 2024
4f5fecb
add run_name
maxisawesome Feb 28, 2024
84f6982
add docstring
maxisawesome Feb 28, 2024
86d92bd
add debug logging
maxisawesome Feb 28, 2024
b13bf47
more logging
maxisawesome Feb 28, 2024
bfa9621
rm info logging
maxisawesome Feb 28, 2024
5fa6d54
improve comments
maxisawesome Feb 28, 2024
eea7df4
Update composer/callbacks/eval_output_logging_callback.py
maxisawesome Feb 28, 2024
eb7200c
rm logging bool
maxisawesome Feb 29, 2024
a3419af
Merge branch 'error_logging_callback_in_batch' of github.com:maxisawe…
maxisawesome Feb 29, 2024
d994296
fix logging for schema tasks
maxisawesome Mar 1, 2024
8c89376
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Mar 1, 2024
ae83f03
fix schema / mc tasks
maxisawesome Mar 1, 2024
454a174
yapf
maxisawesome Mar 1, 2024
8029ebc
rm reshape
maxisawesome Mar 1, 2024
95f81c8
fix tests
maxisawesome Mar 1, 2024
79fa8bb
cleanup test
maxisawesome Mar 1, 2024
8db97f5
pyright
maxisawesome Mar 1, 2024
b1da147
pyright
maxisawesome Mar 1, 2024
acc3e92
docstring
maxisawesome Mar 1, 2024
bfd76a7
pyright
maxisawesome Mar 1, 2024
04188fd
update tests
maxisawesome Mar 1, 2024
cf83165
rm attention mask requirement
maxisawesome Mar 1, 2024
db14ec5
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Mar 1, 2024
a89d12b
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Mar 2, 2024
7785168
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Mar 3, 2024
6fb823d
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Mar 4, 2024
2f63d99
Update composer/metrics/nlp.py
maxisawesome Mar 5, 2024
29951dc
Update composer/metrics/nlp.py
maxisawesome Mar 5, 2024
560e702
rm todo
maxisawesome Mar 6, 2024
8bccb7e
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Mar 6, 2024
e3e7b54
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Mar 7, 2024
685aa2a
Merge branch 'dev' into error_logging_callback_in_batch
maxisawesome Mar 8, 2024
68bb9d5
Merge branch 'error_logging_callback_in_batch' of github.com:maxisawe…
maxisawesome Mar 8, 2024
87e5b26
lint
mvpatel2000 Mar 8, 2024
dd50335
lint
mvpatel2000 Mar 8, 2024
df16843
lint
mvpatel2000 Mar 8, 2024
747fe35
more lint
mvpatel2000 Mar 8, 2024
2 changes: 2 additions & 0 deletions composer/callbacks/__init__.py
Original file line number Diff line number Diff line change
@@ -9,6 +9,7 @@
from composer.callbacks.activation_monitor import ActivationMonitor
from composer.callbacks.checkpoint_saver import CheckpointSaver
from composer.callbacks.early_stopper import EarlyStopper
from composer.callbacks.eval_output_logging_callback import EvalOutputLogging
from composer.callbacks.export_for_inference import ExportForInferenceCallback
from composer.callbacks.free_outputs import FreeOutputs
from composer.callbacks.generate import Generate
@@ -35,6 +36,7 @@
'CheckpointSaver',
'MLPerfCallback',
'EarlyStopper',
'EvalOutputLogging',
'ExportForInferenceCallback',
'ThresholdStopper',
'ImageVisualizer',
120 changes: 120 additions & 0 deletions composer/callbacks/eval_output_logging_callback.py
@@ -0,0 +1,120 @@
# Copyright 2022 MosaicML Composer authors
# SPDX-License-Identifier: Apache-2.0

"""Log model outputs and expected outputs during ICL evaluation."""

import warnings
from copy import deepcopy
from typing import Any, Dict, List, Sequence, Union

import torch

from composer.core import Callback, State
from composer.loggers import ConsoleLogger, Logger
from composer.utils.dist import all_gather_object


class EvalOutputLogging(Callback):
"""Logs eval outputs for each sample of each ICL evaluation dataset.

ICL metrics are required to support caching the model's responses, including information on whether the model was correct.
Metrics are responsible for returning the results of individual datapoints in a dictionary of lists.
The callback will log the metric name, the depadded and detokenized input, any data stored in state.metric_outputs, and
any keys from the batch passed into `batch_keys_to_log`. It will do so after every eval batch.
"""

def __init__(self, log_tokens=False, *args, **kwargs):
super().__init__(*args, **kwargs)
self.log_tokens = log_tokens
self.columns = None
self.name = None
self.rows = []

def eval_batch_end(self, state: State, logger: Logger) -> None:
if not isinstance(state.batch, Dict):
warnings.warn(f'''EvalOutputLogging only supports batches that are dictionaries. \
Found batch of type {type(state.batch)}. \
Not logging eval outputs.''')
return

assert state.outputs is not None
assert state.metric_outputs is not None
logging_dict: Dict[str, Union[List[Any], torch.Tensor, Sequence[torch.Tensor]]] = deepcopy(state.metric_outputs)

# If batch mode is not generate, outputs will be logits
if state.batch['mode'] == 'generate':
# Outputs are already detokenized
logging_dict['outputs'] = state.outputs

input_ids = state.batch['input_ids']
logged_input = []
assert state.dataloader is not None

# Depad and decode input_ids
for input_list in input_ids.tolist():
depadded_input = [
tok for tok in input_list
if tok != state.dataloader.dataset.pad_tok_id # pyright: ignore[reportGeneralTypeIssues]
]
logged_input.append(
state.dataloader.dataset.tokenizer.decode(depadded_input)) # pyright: ignore[reportGeneralTypeIssues]
logging_dict['input'] = logged_input

# Log token indices if toggled
if self.log_tokens:
logging_dict['input_tokens'] = input_ids.tolist()
if state.batch['mode'] != 'generate':
if isinstance(state.outputs, torch.Tensor): # pyright
logging_dict['label_tokens'] = state.outputs.tolist()

# Add run_name as a column
run_name_list = [state.run_name for _ in range(len(logging_dict['input']))]
logging_dict['run_name'] = run_name_list

# NOTE: This assumes _any_ tensor logged are tokens to be decoded.
# This might not be true if, for example, logits are logged.

# Detokenize data in rows
for key, value in logging_dict.items():
# All types in list are the same
if isinstance(value[0], torch.Tensor):
logging_dict[key] = [
state.dataloader.dataset.tokenizer.decode(t) # pyright: ignore[reportGeneralTypeIssues]
for t in value
]
elif isinstance(value[0], list):
if isinstance(value[0][0], torch.Tensor):
logging_dict[key] = [
[
state.dataloader.dataset.tokenizer.decode( # pyright: ignore[reportGeneralTypeIssues]
choice) for choice in t
] for t in value
]

# Convert logging_dict from kv pairs of column name and column values to a list of rows
# Example:
# logging_dict = {"a": ["1a", "2a"], "b": ["1b", "2b"]}
# will become
# columns = {"a", "b"}, rows = [["1a", "1b"], ["2a", "2b"]]
columns = list(logging_dict.keys())
rows = [list(item) for item in zip(*logging_dict.values())]

assert state.dataloader_label is not None
if not self.name:
# If only running eval, step will be 0
# If running training, step will be current training step
step = state.timestamp.batch.value
self.name = f'{state.dataloader_label}_step_{step}'
self.columns = columns
self.rows.extend(rows)

def eval_end(self, state: State, logger: Logger) -> None:
list_of_rows = all_gather_object(self.rows)
rows = [row for rows in list_of_rows for row in rows]
for dest_logger in logger.destinations:
if not isinstance(dest_logger, ConsoleLogger):
dest_logger.log_table(self.columns, rows, name=self.name, step=state.timestamp.batch.value)

self.rows = []
self.name = None
self.columns = None
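The callback's row building — depadding `input_ids`, decoding them, and transposing the column-oriented `logging_dict` into rows for `log_table` — can be sketched in isolation. The toy tokenizer and pad id below are stand-ins for illustration, not Composer APIs:

```python
# Stand-alone sketch of EvalOutputLogging's row building (toy tokenizer, not Composer's).
PAD_TOK_ID = 0

class ToyTokenizer:
    """Hypothetical tokenizer: decodes token ids to a space-joined string."""
    def decode(self, token_ids):
        return ' '.join(f'tok{t}' for t in token_ids)

tokenizer = ToyTokenizer()

# A batch of padded input ids, as input_ids.tolist() would yield.
input_ids = [[5, 7, 0, 0], [9, 0, 0, 0]]

# Depad and decode each row, mirroring the loop in eval_batch_end.
logged_input = [
    tokenizer.decode([tok for tok in row if tok != PAD_TOK_ID])
    for row in input_ids
]

logging_dict = {
    'input': logged_input,
    'run_name': ['my-run'] * len(logged_input),
}

# Transpose column-oriented dict into per-sample rows, as done before log_table.
columns = list(logging_dict.keys())
rows = [list(item) for item in zip(*logging_dict.values())]
print(columns)  # ['input', 'run_name']
print(rows)     # [['tok5 tok7', 'my-run'], ['tok9', 'my-run']]
```

The same flattening idiom used in `eval_end` (`[row for rows in list_of_rows for row in rows]`) then merges the per-rank row lists returned by `all_gather_object` into one table.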
2 changes: 2 additions & 0 deletions composer/core/state.py
@@ -518,6 +518,8 @@ def __init__(
self.eval_metric_values: Dict[str, float] = {}
self.total_loss_dict: Dict[str, float] = {}

self.metric_outputs: Dict[str, Any] = {}

def _dataset_of(self, dataloader: Optional[Union[Evaluator, DataSpec, DataLoader, Iterable]]) -> Optional[Dataset]:
"""Get the dataset contained by the given dataloader-like object.

6 changes: 4 additions & 2 deletions composer/loggers/in_memory_logger.py
@@ -83,8 +83,10 @@ def log_table(self,
raise MissingConditionalImportError(extra_deps_group='pandas',
conda_package='pandas',
conda_channel='conda-forge') from e
table = pd.DataFrame.from_records(data=rows, columns=columns).to_json(orient='split', index=False)
assert isinstance(table, str)
table = pd.DataFrame.from_records(data=rows, columns=columns).to_json(orient='split',
index=False,
force_ascii=False)
assert table is not None
self.tables[name] = table

def log_metrics(self, metrics: Dict[str, Any], step: Optional[int] = None) -> None:
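The `force_ascii=False` added to `to_json` here keeps non-ASCII characters in logged tables verbatim instead of `\uXXXX`-escaping them. The standard library's `json` module exposes the same trade-off via its `ensure_ascii` flag, shown below as an analogy (not the pandas call itself):

```python
import json

row = {'output': 'naïve café'}

escaped = json.dumps(row)                       # default: ensure_ascii=True
verbatim = json.dumps(row, ensure_ascii=False)  # analogous to force_ascii=False

print(escaped)   # {"output": "na\u00efve caf\u00e9"}
print(verbatim)  # {"output": "naïve café"}
```

With escaping disabled, decoded model outputs containing accented or non-Latin text round-trip through the logged table unchanged.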
4 changes: 3 additions & 1 deletion composer/loggers/wandb_logger.py
@@ -109,6 +109,8 @@ def __init__(
self.run_dir: Optional[str] = None
self.run_url: Optional[str] = None

self.table_dict = {}

def _set_is_in_atexit(self):
self._is_in_atexit = True

@@ -125,7 +127,7 @@ def log_table(self,
if self._enabled:
import wandb
table = wandb.Table(columns=columns, rows=rows)
wandb.log({name: table}, step)
wandb.log({name: table}, step=step)

def log_metrics(self, metrics: Dict[str, Any], step: Optional[int] = None) -> None:
if self._enabled: