Conversation

sourcery-ai bot commented Jul 26, 2023

Branch master refactored by Sourcery.

If you're happy with these changes, merge this Pull Request using the Squash and merge strategy.

See our documentation here.

Run Sourcery locally

Reduce the feedback loop during development by using the Sourcery editor plugin:

Review changes via command line

To manually merge these changes, make sure you're on the master branch, then run:

git fetch origin sourcery/master   # fetch the refactored branch
git merge --ff-only FETCH_HEAD     # fast-forward your local master onto it
git reset HEAD^                    # drop the merge commit, keeping the changes in your working tree

Help us improve this pull request!

sourcery-ai bot requested a review from theonesud July 26, 2023 13:35

sourcery-ai bot left a comment

Due to GitHub API limits, only the first 60 comments can be shown.

Comment on lines -58 to +60
-        self.criterions = nn.ModuleList([nn.CrossEntropyLoss(size_average=False) for i in self.cutoff])
+        self.criterions = nn.ModuleList(
+            [nn.CrossEntropyLoss(size_average=False) for _ in self.cutoff]
+        )

Function AdaptiveLoss.__init__ refactored with the following changes:

- Replace the unused loop variable with _
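
Side note on the loss construction: `size_average` is deprecated in modern PyTorch in favor of `reduction`. A minimal sketch of the equivalent modern call (the cutoff values here are hypothetical):

import torch.nn as nn

cutoff = [2000, 4000, 10000]  # hypothetical cutoffs, for illustration only
# size_average=False summed the per-element losses; reduction='sum' is the modern equivalent
criterions = nn.ModuleList(
    [nn.CrossEntropyLoss(reduction='sum') for _ in cutoff]
)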

-        if self.is_multi:
-            x = F.sigmoid(x)
-        else:
-            x = F.log_softmax(x)
+        x = F.sigmoid(x) if self.is_multi else F.log_softmax(x)

Function MixedInputModel.forward refactored with the following changes:

- Replace the if/else assignment with a conditional expression
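
Not part of this diff, but worth flagging while touching the line: `F.sigmoid` is deprecated in favor of `torch.sigmoid`, and `F.log_softmax` warns when `dim` is left implicit. A hedged, self-contained modernization of the same expression:

import torch
import torch.nn.functional as F

def activate(x, is_multi):
    # torch.sigmoid replaces the deprecated F.sigmoid; dim=-1 assumes the
    # class scores live in the last dimension.
    return torch.sigmoid(x) if is_multi else F.log_softmax(x, dim=-1)

print(activate(torch.randn(2, 5), is_multi=False).shape)  # torch.Size([2, 5])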


-        if f in model_meta: cut,self.lr_cut = model_meta[f]
-        else: cut,self.lr_cut = 0,0
+        cut,self.lr_cut = model_meta[f] if f in model_meta else (0, 0)

Function ConvnetBuilder.__init__ refactored with the following changes:

- Replace the if/else assignment with a conditional expression

Comment on lines -183 to +181
-        if self.data.test_dl: predict_to_bcolz(m, self.data.test_dl, test_act)
+        if self.data.test_dl:
+            predict_to_bcolz(m, self.data.test_dl, test_act)

Function ConvLearner.save_fc1 refactored with the following changes:

- Split the single-line if statement onto two lines

 def datafy(x):
-    if is_listy(x): return [o.data for o in x]
-    else: return x.data
+    return [o.data for o in x] if is_listy(x) else x.data

Function datafy refactored with the following changes:

- Collapse the if/else returns into a single conditional-expression return
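
A quick standalone check of the one-liner, using a hypothetical stand-in for fastai's is_listy helper:

import torch

def is_listy(x):
    return isinstance(x, (list, tuple))  # stand-in for the fastai helper

def datafy(x):
    return [o.data for o in x] if is_listy(x) else x.data

print(datafy(torch.zeros(2)))                   # a single tensor's .data
print(datafy([torch.zeros(2), torch.ones(2)]))  # mapped over each element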

Comment on lines -43 to +45
-    try:
-        for i in range(len(inst)): inst.pop().close()
-    except Exception:
-        pass
+    with contextlib.suppress(Exception):
+        for _ in range(len(inst)):
+            inst.pop().close()

Function clear_tqdm refactored with the following changes:

- Use contextlib.suppress instead of try/except/pass
- Replace the unused loop variable with _
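
Note that the refactor assumes `import contextlib` is added at the top of the module. contextlib.suppress keeps the original semantics exactly: the first exception raised anywhere in the block exits the block silently. A minimal standalone illustration:

import contextlib

class Noisy:  # hypothetical stand-in for a tqdm instance
    def close(self):
        raise RuntimeError('already closed')

inst = [Noisy(), Noisy()]
with contextlib.suppress(Exception):
    for _ in range(len(inst)):
        inst.pop().close()  # the first close() raises and is swallowed

print(len(inst))  # 1 -- the block exited at the first exception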

Comment on lines -51 to +55
-        if 'betas' in self.opt.param_groups[0]:
-            for pg in self.opt.param_groups: pg['betas'] = (momentum, pg['betas'][1])
-        else:
-            for pg in self.opt.param_groups: pg['momentum'] = momentum
+        for pg in self.opt.param_groups:
+            if 'betas' in self.opt.param_groups[0]:
+                pg['betas'] = (momentum, pg['betas'][1])
+            else:
+                pg['momentum'] = momentum

Function LayerOptimizer.set_mom refactored with the following changes:

- Merge the duplicated loops over param_groups into one

Comment on lines -57 to +62
-        if 'betas' in self.opt.param_groups[0]:
-            for pg in self.opt.param_groups: pg['betas'] = (pg['betas'][0],beta)
-        elif 'alpha' in self.opt.param_groups[0]:
-            for pg in self.opt.param_groups: pg['alpha'] = beta
+        for pg in self.opt.param_groups:
+            if 'betas' in self.opt.param_groups[0]:
+                pg['betas'] = (pg['betas'][0],beta)
+            elif 'alpha' in self.opt.param_groups[0]:
+                pg['alpha'] = beta

Function LayerOptimizer.set_beta refactored with the following changes:

- Merge the duplicated loops over param_groups into one
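
In both set_mom and set_beta the membership test only inspects param_groups[0], so it does not depend on the loop variable and could be hoisted back out of the loop; a sketch of that variant, assuming opt is a torch.optim optimizer:

def set_beta(opt, beta):
    first = opt.param_groups[0]  # decide the optimizer family once
    if 'betas' in first:
        for pg in opt.param_groups:
            pg['betas'] = (pg['betas'][0], beta)
    elif 'alpha' in first:
        for pg in opt.param_groups:
            pg['alpha'] = beta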

Comment on lines -98 to +99
-    def get_model_path(self, name): return os.path.join(self.models_path,name)+'.h5'
+    def get_model_path(self, name):
+        return f'{os.path.join(self.models_path, name)}.h5'

Function Learner.get_model_path refactored with the following changes:

- Move the body onto its own line and build the path with an f-string
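
An equivalent built on pathlib, offered only as an alternative sketch (assuming models_path is a directory path and name is a bare checkpoint name):

from pathlib import Path

def get_model_path(models_path, name):
    return str(Path(models_path) / f'{name}.h5')

print(get_model_path('models', 'resnet34'))  # models/resnet34.h5 ('resnet34' is a hypothetical name)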

Comment on lines -102 to +104
-        if hasattr(self, 'swa_model'): save_model(self.swa_model, self.get_model_path(name)[:-3]+'-swa.h5')
+        if hasattr(self, 'swa_model'):
+            save_model(self.swa_model, f'{self.get_model_path(name)[:-3]}-swa.h5')

Function Learner.save refactored with the following changes:

- Split the single-line if statement and build the SWA checkpoint name with an f-string

Comment on lines -106 to +109
-        if hasattr(self, 'swa_model'): load_model(self.swa_model, self.get_model_path(name)[:-3]+'-swa.h5')
+        if hasattr(self, 'swa_model'):
+            load_model(self.swa_model, f'{self.get_model_path(name)[:-3]}-swa.h5')

Function Learner.load refactored with the following changes:

- Split the single-line if statement and build the SWA checkpoint name with an f-string

Comment on lines -388 to +392
-        if not isinstance(arr, np.ndarray): raise OSError(f'Not valid numpy array')
+        if not isinstance(arr, np.ndarray):
+            raise OSError('Not valid numpy array')

Function Learner.predict_array refactored with the following changes:

- Split the single-line if statement and drop the f prefix from a string without placeholders

Comment on lines -414 to +421
-        preds2 = [predict_with_targs(self.model, dl2)[0] for i in tqdm(range(n_aug), leave=False)]
+        preds2 = [
+            predict_with_targs(self.model, dl2)[0]
+            for _ in tqdm(range(n_aug), leave=False)
+        ]

Function Learner.TTA refactored with the following changes:

- Replace the unused loop variable with _

Comment on lines -82 to +84
-        self.dropouths = nn.ModuleList([LockedDropout(dropouth) for l in range(n_layers)])
+        self.dropouths = nn.ModuleList(
+            [LockedDropout(dropouth) for _ in range(n_layers)]
+        )

Function RNN_Encoder.__init__ refactored with the following changes:

- Replace the unused loop variable with _

Comment on lines +30 to +45

 import numpy as np
 import torch
 import torch.nn.init
 import torch.nn as nn

-gg = {}
-gg['hook_position'] = 0
-gg['total_fc_conv_layers'] = 0
-gg['done_counter'] = -1
-gg['hook'] = None
-gg['act_dict'] = {}
-gg['counter_to_apply_correction'] = 0
-gg['correction_needed'] = False
-gg['current_coef'] = 1.0
+gg = {
+    'hook_position': 0,
+    'total_fc_conv_layers': 0,
+    'done_counter': -1,
+    'hook': None,
+    'act_dict': {},
+    'counter_to_apply_correction': 0,
+    'correction_needed': False,
+    'current_coef': 1.0,
+}

Lines 35-43 refactored with the following changes:

- Merge the successive dict assignments into a single dict literal

Comment on lines -324 to +323
-        for i in range(len(self)):
+        for _ in range(len(self)):

Function TextDataLoader.__iter__ refactored with the following changes:

- Replace the unused loop variable with _

         w = getattr(self.module, name_w)
         del self.module._parameters[name_w]
-        self.module.register_parameter(name_w + '_raw', nn.Parameter(w.data))
+        self.module.register_parameter(f'{name_w}_raw', nn.Parameter(w.data))

Function WeightDrop._setup refactored with the following changes:

- Build the raw parameter name with an f-string
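
The _setup trick is easier to see on a toy module: the weight is removed from the module's parameter dict and re-registered under a _raw name, which is what the f-string builds. A standalone sketch:

import torch.nn as nn

m = nn.Linear(2, 2)
w = m.weight
del m._parameters['weight']                               # drop the original parameter
m.register_parameter('weight_raw', nn.Parameter(w.data))  # re-register under the _raw name
print(sorted(n for n, _ in m.named_parameters()))         # ['bias', 'weight_raw']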

"""
for name_w in self.weights:
raw_w = getattr(self.module, name_w + '_raw')
raw_w = getattr(self.module, f'{name_w}_raw')

Function WeightDrop._setweights refactored with the following changes:

- Build the raw parameter name with an f-string

Comment on lines -180 to +201

-        if IS_TORCH_04:
-            X = F.embedding(words,
-                masked_embed_weight, padding_idx, self.embed.max_norm,
-                self.embed.norm_type, self.embed.scale_grad_by_freq, self.embed.sparse)
-        else:
-            X = self.embed._backend.Embedding.apply(words,
-                masked_embed_weight, padding_idx, self.embed.max_norm,
-                self.embed.norm_type, self.embed.scale_grad_by_freq, self.embed.sparse)
-
-        return X
+        return (
+            F.embedding(
+                words,
+                masked_embed_weight,
+                padding_idx,
+                self.embed.max_norm,
+                self.embed.norm_type,
+                self.embed.scale_grad_by_freq,
+                self.embed.sparse,
+            )
+            if IS_TORCH_04
+            else self.embed._backend.Embedding.apply(
+                words,
+                masked_embed_weight,
+                padding_idx,
+                self.embed.max_norm,
+                self.embed.norm_type,
+                self.embed.scale_grad_by_freq,
+                self.embed.sparse,
+            )
+        )

Function EmbeddingDropout.forward refactored with the following changes:

- Return the conditional expression directly instead of assigning to X first
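
Since the whole method now hinges on a single F.embedding call, a minimal standalone demo of that call (signature as in torch >= 0.4) may help readers:

import torch
import torch.nn.functional as F

weight = torch.randn(10, 3)              # a 10-word vocab with 3-dim embeddings
words = torch.tensor([[1, 4], [2, 9]])   # a batch of index sequences
X = F.embedding(words, weight)           # gathers rows of weight by index
print(X.shape)                           # torch.Size([2, 2, 3])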

-        if self.iteration == self.nb:
-            return True
-        return super().on_batch_end(loss)
+        return True if self.iteration == self.nb else super().on_batch_end(loss)

Function LR_Finder2.on_batch_end refactored with the following changes:

- Collapse the early return into a conditional expression
