
Get notified when packages are built

A new feature has been added. FreshPorts already tracks packages built by the FreeBSD project; this information is displayed on each port page. You can now receive an email when FreshPorts notices that a new package is available for something on one of your watch lists. However, you must opt in: click Report Subscriptions on the right, check the New Package Notification box, and click Update.

Finally, under Watch Lists, click ABI Package Subscriptions to select the ABI (e.g. FreeBSD:14:amd64) and package set (latest/quarterly) combination for a given watch list. This is what FreshPorts will look for.

non port: math/py-autograd/pkg-descr

Number of commits found: 2

Wednesday, 7 Sep 2022
21:58 Stefan Eßer (se)
Remove WWW entries moved into port Makefiles

Commit b7f05445c00f has added WWW entries to port Makefiles based on
WWW: lines in pkg-descr files.

This commit removes the WWW: lines of moved-over URLs from these
pkg-descr files.

Approved by:		portmgr (tcberner)
commit hash: fb16dfecae4a6efac9f3a78e0b759fb7a3c53de4
Wednesday, 27 Feb 2019
22:11 rm
Autograd can automatically differentiate native Python and Numpy code. It can
handle a large subset of Python's features, including loops, ifs, recursion and
closures, and it can even take derivatives of derivatives of derivatives. It
supports reverse-mode differentiation (a.k.a. backpropagation), which means it
can efficiently take gradients of scalar-valued functions with respect to
array-valued arguments, as well as forward-mode differentiation, and the two
can be composed arbitrarily. The main intended application of Autograd is
gradient-based optimization.

WWW: https://github.com/HIPS/autograd
Original commit Revision: 494091
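
To illustrate the description above, here is a minimal sketch of the reverse-mode usage it mentions, assuming the upstream HIPS/autograd API (autograd.numpy and autograd.grad); the tanh function is only an example, not part of the port:

    import autograd.numpy as np   # thinly wrapped NumPy; operations are traced
    from autograd import grad     # reverse-mode differentiation of scalar-valued functions

    def tanh(x):
        # Plain Python/NumPy code; loops, ifs, recursion and closures also work.
        y = np.exp(-2.0 * x)
        return (1.0 - y) / (1.0 + y)

    d_tanh = grad(tanh)        # derivative with respect to the argument
    d2_tanh = grad(d_tanh)     # derivatives of derivatives compose freely

    print(tanh(1.0), d_tanh(1.0), d2_tanh(1.0))

Since grad() returns an ordinary Python function, it can be called repeatedly inside an optimization loop, which matches the package's stated main application of gradient-based optimization.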
