Activation Function


    lucycodes42

    PyTorch Public Recipes

    Library: pytorch

    Shortcut: pytorch.f.activation

    F.threshold(input, threshold, value, inplace=False)
    F.relu(input, inplace=False)
    F.relu6(input, inplace=False)
    F.hardtanh(input, min_val=-1., max_val=1., inplace=False)
    F.elu(input, alpha=1.0, inplace=False)
    F.selu(input, inplace=False)
    F.celu(input, alpha=1., inplace=False)
    F.leaky_relu(input, negative_slope=0.01, inplace=False)
    F.prelu(input, weight)
    F.rrelu(input, lower=1./8, upper=1./3, training=False, inplace=False)
    F.glu(input, dim=-1)
    F.logsigmoid(input)
    F.hardshrink(input, lambd=0.5)
    F.tanhshrink(input)
    F.softsign(input)
    F.softplus(input, beta=1, threshold=20)
    F.softmin(input, dim=None, _stacklevel=3)
    F.softmax(input, dim=None, _stacklevel=3)
    F.softshrink(input, lambd=0.5)
    F.gumbel_softmax(logits, tau=1, hard=False, eps=1e-10)
    F.log_softmax(input, dim=None, _stacklevel=3)
    F.tanh(input)
    F.sigmoid(input)
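
    The signatures above come from torch.nn.functional (conventionally imported as F). Below is a minimal usage sketch, not part of the original recipe, showing how a few of these activations are called on an example tensor; the tensor values and the choice of functions are illustrative assumptions only.

    import torch
    import torch.nn.functional as F

    # Small example input; shape (1, 5).
    x = torch.tensor([[-2.0, -0.5, 0.0, 0.5, 2.0]])

    # Element-wise activations return a new tensor of the same shape.
    print(F.relu(x))                              # negatives clamped to 0
    print(F.leaky_relu(x, negative_slope=0.01))   # small slope for negatives
    print(F.elu(x, alpha=1.0))                    # exponential for negatives

    # softmax / log_softmax need an explicit dim; normalize over the last axis.
    print(F.softmax(x, dim=-1))
    print(F.log_softmax(x, dim=-1))

    # In-place variants overwrite the input instead of allocating a new tensor.
    y = x.clone()
    F.relu(y, inplace=True)
    print(y)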