Sourcery refactored master branch #1
base: master
Conversation
Due to GitHub API limits, only the first 60 comments can be shown.
```diff
- self.criterions = nn.ModuleList([nn.CrossEntropyLoss(size_average=False) for i in self.cutoff])
+ self.criterions = nn.ModuleList(
+     [nn.CrossEntropyLoss(size_average=False) for _ in self.cutoff]
+ )
```
Function `AdaptiveLoss.__init__` refactored with the following changes:
- Replace unused for index with underscore (`for-index-underscore`)
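For context, `for-index-underscore` is purely cosmetic: a loop variable that is never read is conventionally named `_`. A minimal standalone sketch with hypothetical cutoff values; note that `size_average=False` is deprecated in current PyTorch, where `reduction='sum'` is the equivalent:

```python
import torch.nn as nn

cutoff = [2000, 10000]  # hypothetical cutoff values, just for illustration

# One summed cross-entropy criterion per cutoff; the index is unused, so `_`.
criterions = nn.ModuleList(
    [nn.CrossEntropyLoss(reduction='sum') for _ in cutoff]
)
print(len(criterions))  # 2
```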
```diff
- if self.is_multi:
-     x = F.sigmoid(x)
- else:
-     x = F.log_softmax(x)
+ x = F.sigmoid(x) if self.is_multi else F.log_softmax(x)
```
Function `MixedInputModel.forward` refactored with the following changes:
- Replace if statement with if expression (`assign-if-exp`)
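A generic, runnable illustration of `assign-if-exp` (values invented; `F.sigmoid` is deprecated in modern PyTorch, so the sketch uses `torch.sigmoid` and passes `dim` to `log_softmax` explicitly):

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 3)
is_multi = False

# Four lines of if/else assigning to one name collapse into one expression.
x = torch.sigmoid(x) if is_multi else F.log_softmax(x, dim=-1)
print(x.shape)  # torch.Size([4, 3])
```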
```diff
- if f in model_meta: cut,self.lr_cut = model_meta[f]
- else: cut,self.lr_cut = 0,0
+ cut,self.lr_cut = model_meta[f] if f in model_meta else (0, 0)
```
Function `ConvnetBuilder.__init__` refactored with the following changes:
- Replace if statement with if expression [×2] (`assign-if-exp`)
```diff
- if self.data.test_dl: predict_to_bcolz(m, self.data.test_dl, test_act)
+ predict_to_bcolz(m, self.data.test_dl, test_act)
```
Function `ConvLearner.save_fc1` refactored with the following changes:
- Remove redundant conditional (`remove-redundant-if`)
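One thing to check when accepting `remove-redundant-if`: dropping a truthiness guard only preserves behavior if the guarded value is always truthy at that point. A toy illustration with invented names:

```python
def save_activations(test_dl):
    # With the guard, a missing test loader is silently skipped;
    # without it, the work below would run (and fail) on None.
    if test_dl:
        return f'predicted {len(test_dl)} batches'
    return 'no test set'

print(save_activations(None))    # no test set
print(save_activations([1, 2]))  # predicted 2 batches
```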
```diff
  def datafy(x):
-     if is_listy(x): return [o.data for o in x]
-     else: return x.data
+     return [o.data for o in x] if is_listy(x) else x.data
```
Function `datafy` refactored with the following changes:
- Replace if statement with if expression (`assign-if-exp`)
```diff
- try:
-     for i in range(len(inst)): inst.pop().close()
- except Exception:
-     pass
+ with contextlib.suppress(Exception):
+     for _ in range(len(inst)):
+         inst.pop().close()
```
Function `clear_tqdm` refactored with the following changes:
- Use `contextlib`'s `suppress` method to silence an error (`use-contextlib-suppress`)
- Replace unused for index with underscore (`for-index-underscore`)
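`contextlib.suppress` is standard-library and behaves like `try/except ...: pass`; a self-contained sketch with a plain list standing in for the tqdm instances:

```python
import contextlib

bars = ['bar-a', 'bar-b']  # hypothetical stand-ins for open tqdm bars

# Suppresses any Exception raised inside the block. Note the whole block is
# abandoned on the first error, so later close() calls would be skipped too,
# exactly as in the original try/except version.
with contextlib.suppress(Exception):
    for _ in range(len(bars)):
        bars.pop()  # stand-in for inst.pop().close()

print(bars)  # []
```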
```diff
- if 'betas' in self.opt.param_groups[0]:
-     for pg in self.opt.param_groups: pg['betas'] = (momentum, pg['betas'][1])
- else:
-     for pg in self.opt.param_groups: pg['momentum'] = momentum
+ for pg in self.opt.param_groups:
+     if 'betas' in self.opt.param_groups[0]:
+         pg['betas'] = (momentum, pg['betas'][1])
+     else:
+         pg['momentum'] = momentum
```
Function `LayerOptimizer.set_mom` refactored with the following changes:
- Hoist for/while loops out of nested conditionals (`hoist-loop-from-if`)
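The hoisted form reads more naturally, though the test still runs once per iteration even though it only inspects `param_groups[0]`. A plain-Python sketch (param groups faked as dicts) showing that the loop-invariant check can be lifted out entirely:

```python
param_groups = [{'betas': (0.9, 0.999)}, {'betas': (0.9, 0.999)}]
momentum = 0.95

# The condition depends only on param_groups[0], so evaluate it once.
use_betas = 'betas' in param_groups[0]
for pg in param_groups:
    if use_betas:
        pg['betas'] = (momentum, pg['betas'][1])
    else:
        pg['momentum'] = momentum

print(param_groups[0]['betas'])  # (0.95, 0.999)
```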
```diff
- if 'betas' in self.opt.param_groups[0]:
-     for pg in self.opt.param_groups: pg['betas'] = (pg['betas'][0],beta)
- elif 'alpha' in self.opt.param_groups[0]:
-     for pg in self.opt.param_groups: pg['alpha'] = beta
+ for pg in self.opt.param_groups:
+     if 'betas' in self.opt.param_groups[0]:
+         pg['betas'] = (pg['betas'][0],beta)
+     elif 'alpha' in self.opt.param_groups[0]:
+         pg['alpha'] = beta
```
Function `LayerOptimizer.set_beta` refactored with the following changes:
- Hoist for/while loops out of nested conditionals (`hoist-loop-from-if`)
```diff
- def get_model_path(self, name): return os.path.join(self.models_path,name)+'.h5'
+ def get_model_path(self, name):
+     return f'{os.path.join(self.models_path, name)}.h5'
```
Function `Learner.get_model_path` refactored with the following changes:
- Use f-string instead of string concatenation (`use-fstring-for-concatenation`)
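The two spellings are equivalent; a quick check with invented path values:

```python
import os

models_path, name = 'models', 'resnet34'  # hypothetical values

old = os.path.join(models_path, name) + '.h5'  # concatenation
new = f'{os.path.join(models_path, name)}.h5'  # f-string
assert old == new
print(new)  # models/resnet34.h5 (models\resnet34.h5 on Windows)
```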
```diff
- if hasattr(self, 'swa_model'): save_model(self.swa_model, self.get_model_path(name)[:-3]+'-swa.h5')
+ if hasattr(self, 'swa_model'):
+     save_model(self.swa_model, f'{self.get_model_path(name)[:-3]}-swa.h5')
```
Function `Learner.save` refactored with the following changes:
- Use f-string instead of string concatenation (`use-fstring-for-concatenation`)
```diff
- if hasattr(self, 'swa_model'): load_model(self.swa_model, self.get_model_path(name)[:-3]+'-swa.h5')
+ if hasattr(self, 'swa_model'):
+     load_model(self.swa_model, f'{self.get_model_path(name)[:-3]}-swa.h5')
```
Function `Learner.load` refactored with the following changes:
- Use f-string instead of string concatenation (`use-fstring-for-concatenation`)
```diff
- if not isinstance(arr, np.ndarray): raise OSError(f'Not valid numpy array')
+ if not isinstance(arr, np.ndarray):
+     raise OSError('Not valid numpy array')
```
Function `Learner.predict_array` refactored with the following changes:
- Replace f-string with no interpolated values with string (`remove-redundant-fstring`)
```diff
- preds2 = [predict_with_targs(self.model, dl2)[0] for i in tqdm(range(n_aug), leave=False)]
+ preds2 = [
+     predict_with_targs(self.model, dl2)[0]
+     for _ in tqdm(range(n_aug), leave=False)
+ ]
```
Function `Learner.TTA` refactored with the following changes:
- Replace unused for index with underscore (`for-index-underscore`)
```diff
- self.dropouths = nn.ModuleList([LockedDropout(dropouth) for l in range(n_layers)])
+ self.dropouths = nn.ModuleList(
+     [LockedDropout(dropouth) for _ in range(n_layers)]
+ )
```
Function `RNN_Encoder.__init__` refactored with the following changes:
- Replace unused for index with underscore (`for-index-underscore`)
```diff
  import numpy as np
  import torch
  import torch.nn.init
  import torch.nn as nn

- gg = {}
- gg['hook_position'] = 0
- gg['total_fc_conv_layers'] = 0
- gg['done_counter'] = -1
- gg['hook'] = None
- gg['act_dict'] = {}
- gg['counter_to_apply_correction'] = 0
- gg['correction_needed'] = False
- gg['current_coef'] = 1.0
+ gg = {
+     'hook_position': 0,
+     'total_fc_conv_layers': 0,
+     'done_counter': -1,
+     'hook': None,
+     'act_dict': {},
+     'counter_to_apply_correction': 0,
+     'correction_needed': False,
+     'current_coef': 1.0,
+ }
```
Lines 35-43 refactored with the following changes:
- Merge dictionary assignment with declaration [×8] (`merge-dict-assign`)
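The general shape of `merge-dict-assign`, on a toy config dict (names invented):

```python
# Before: declaration followed by repeated item assignment.
cfg = {}
cfg['lr'] = 0.01
cfg['epochs'] = 3

# After: a single literal; insertion order and values are identical.
cfg = {'lr': 0.01, 'epochs': 3}
print(cfg)  # {'lr': 0.01, 'epochs': 3}
```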
```diff
- for i in range(len(self)):
+ for _ in range(len(self)):
```
Function `TextDataLoader.__iter__` refactored with the following changes:
- Replace unused for index with underscore (`for-index-underscore`)
```diff
  w = getattr(self.module, name_w)
  del self.module._parameters[name_w]
- self.module.register_parameter(name_w + '_raw', nn.Parameter(w.data))
+ self.module.register_parameter(f'{name_w}_raw', nn.Parameter(w.data))
```
Function `WeightDrop._setup` refactored with the following changes:
- Use f-string instead of string concatenation (`use-fstring-for-concatenation`)
| """ | ||
| for name_w in self.weights: | ||
| raw_w = getattr(self.module, name_w + '_raw') | ||
| raw_w = getattr(self.module, f'{name_w}_raw') |
Function `WeightDrop._setweights` refactored with the following changes:
- Use f-string instead of string concatenation (`use-fstring-for-concatenation`)
```diff
- if IS_TORCH_04:
-     X = F.embedding(words,
-         masked_embed_weight, padding_idx, self.embed.max_norm,
-         self.embed.norm_type, self.embed.scale_grad_by_freq, self.embed.sparse)
- else:
-     X = self.embed._backend.Embedding.apply(words,
-         masked_embed_weight, padding_idx, self.embed.max_norm,
-         self.embed.norm_type, self.embed.scale_grad_by_freq, self.embed.sparse)
-
- return X
+ return (
+     F.embedding(
+         words,
+         masked_embed_weight,
+         padding_idx,
+         self.embed.max_norm,
+         self.embed.norm_type,
+         self.embed.scale_grad_by_freq,
+         self.embed.sparse,
+     )
+     if IS_TORCH_04
+     else self.embed._backend.Embedding.apply(
+         words,
+         masked_embed_weight,
+         padding_idx,
+         self.embed.max_norm,
+         self.embed.norm_type,
+         self.embed.scale_grad_by_freq,
+         self.embed.sparse,
+     )
+ )
```
Function `EmbeddingDropout.forward` refactored with the following changes:
- Replace if statement with if expression (`assign-if-exp`)
- Inline variable that is immediately returned (`inline-immediately-returned-variable`)
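A minimal sketch of `inline-immediately-returned-variable`, using the public `F.embedding` call with a tiny invented embedding table:

```python
import torch
import torch.nn.functional as F

words = torch.tensor([[0, 2, 1]])  # hypothetical token ids
weight = torch.randn(5, 4)         # 5-token, 4-dim table, for illustration

def lookup_old(words, weight):
    X = F.embedding(words, weight)  # temporary assigned...
    return X                        # ...then immediately returned

def lookup_new(words, weight):
    return F.embedding(words, weight)  # inlined

assert torch.equal(lookup_old(words, weight), lookup_new(words, weight))
```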
```diff
- if self.iteration == self.nb:
-     return True
- return super().on_batch_end(loss)
+ return True if self.iteration == self.nb else super().on_batch_end(loss)
```
Function `LR_Finder2.on_batch_end` refactored with the following changes:
- Lift code into else after jump in control flow (`reintroduce-else`)
- Replace if statement with if expression (`assign-if-exp`)
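The two steps compose like this on a toy version of the method tail (the `super().on_batch_end(loss)` call is stood in by a constant):

```python
iteration, nb = 5, 5

def tail_before():
    if iteration == nb:
        return True
    return 'super-result'  # stand-in for super().on_batch_end(loss)

# reintroduce-else makes the fall-through an explicit branch,
# and assign-if-exp then collapses both branches into one expression.
def tail_after():
    return True if iteration == nb else 'super-result'

assert tail_before() == tail_after()
```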
Branch `master` refactored by Sourcery. If you're happy with these changes, merge this Pull Request using the "Squash and merge" strategy.
See our documentation here.
Run Sourcery locally
Reduce the feedback loop during development by using the Sourcery editor plugin.
Review changes via command line
To manually merge these changes, make sure you're on the `master` branch, then run: