Pruning AlexNet  #2

@pindapuj

Description

Hi, thank you for your great work!

I know it's a bit of a long shot, but I was wondering if you had any insights on a strange problem I came across when pruning AlexNet.

Specifically, I'm trying to use this code to prune AlexNet. I've tried a variety of learning rates, but invariably the following happens: the training and testing accuracy increase while the SNR drops toward 1. As long as SNR > 1, the layerwise sparsity remains 0 across all layers. Then, immediately after SNR < 1, the testing accuracy plummets to around ~1% and does not recover, while the training accuracy remains high.

I was wondering if you had any insights on why this may be happening. I'm waiting until the layerwise sparsity (layerwise_sparsity) is > 0.0 so I can see some pruning, but this comes at a huge, sudden accuracy loss. Am I using the wrong stopping criterion, learning rate, etc.? Any insights into what could be going wrong would be deeply appreciated!
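For context, here is a minimal sketch of how I understand the SNR-based pruning criterion to behave, which may explain the all-or-nothing jump I'm seeing. The function names (`snr_prune_mask`, `layerwise_sparsity`) and the SNR definition |mu| / sigma are my assumptions for illustration, not the repo's actual API:

```python
import numpy as np

def snr_prune_mask(mu, sigma, threshold=1.0):
    # Assumed criterion: keep a weight only while its signal-to-noise
    # ratio |mu| / sigma stays at or above the threshold (here 1.0).
    snr = np.abs(mu) / sigma
    return snr >= threshold

def layerwise_sparsity(mask):
    # Fraction of weights pruned in one layer (mask False = pruned).
    return 1.0 - mask.mean()

# While every weight's SNR is above 1, nothing is pruned at all:
mu = np.array([2.0, -3.0, 1.5])
sigma = np.ones(3)
print(layerwise_sparsity(snr_prune_mask(mu, sigma)))  # 0.0

# Once the noise grows and some SNRs cross below 1, a chunk of the
# layer is zeroed out in one step, so sparsity (and the accuracy hit)
# appears suddenly rather than gradually:
print(layerwise_sparsity(snr_prune_mask(mu, 2.0 * sigma)))
```

If this matches what the code does, the sparsity staying at exactly 0 until the collapse would just mean no weight had yet crossed the threshold, and the crash would be many weights crossing it at nearly the same time.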
