I want something like modulefinder that I can point at a target and have it output the subset of deps/requirements that target actually uses (so I can tree-shake requirements in a monorepo).
Does such a tool exist? (before I attempt to write one)
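In case it helps anyone prototyping this: the stdlib `modulefinder` module can already report the set of modules a script pulls in. A minimal sketch (the filtering to top-level names, and the idea of mapping them back to requirements, are my own additions, not an existing tool):

```python
from modulefinder import ModuleFinder

def imported_top_level_names(script_path):
    """Trace a script with modulefinder and return the sorted set of
    top-level module names it imports (stdlib and third-party alike)."""
    finder = ModuleFinder()
    finder.run_script(script_path)
    # finder.modules maps dotted module names to Module objects;
    # keep only the top-level package name of each.
    return sorted({name.split(".")[0] for name in finder.modules})
```

From there, mapping those names back to distribution names in a requirements file would give the tree-shaken subset, though that name-to-distribution mapping is the genuinely fiddly part.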
I just published version 0.9.0 of logmerger, compatible with Python 3.13. Here is an example merge of a client log and a server log, plus a PCAP packet-capture file showing TCP send/receive traffic. It uses the Textual TUI framework for cursor and mouse interaction in a terminal session. https://pypi.org/project/logmerger
#python #tcp #networking #textual
The problem: @pydantic is great for modeling data!! But at the moment it doesn't support array data out of the box. Often an array's shape and dtype are as important as whether something is an array at all, but there isn't a good way to specify and validate that with the Python type system. Many data formats and standards couple their implementations very tightly to their schemas, making them less flexible, less interoperable, and more difficult to maintain than they could be. The existing tools for parameterized array types, like nptyping and jaxtyping, tie their annotations to a specific array library rather than allowing array specifications that are abstract across implementations.
numpydantic is a tiny, well-tested package with few dependencies that provides generic array annotations for pydantic models. Specify an array along with its shape and dtype, then use that model with any array library you'd like! Extending support to new array libraries is just subclassing - no PRs or monkeypatching needed. The type has some magic under the hood that uses pydantic validators to give a uniform array interface to things that don't usually behave like arrays: pass a path to a video file, that's an array. Pass a path to an HDF5 file and a nested array within it, that's an array. We take advantage of the rest of pydantic's features too, including generating rich JSON schema and smart array dumping.
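To illustrate the core idea of a library-agnostic shape/dtype specification, here is a hand-rolled sketch (this is NOT numpydantic's actual API or implementation, just the concept): any object exposing `.shape` and `.dtype` attributes can be checked against an abstract spec, regardless of which array library produced it.

```python
def validate_array(value, shape, dtype):
    """Check any array-like object (anything with .shape and .dtype)
    against an abstract spec. None in `shape` means "any size along
    that axis". Hypothetical helper, not numpydantic's real interface."""
    actual = tuple(value.shape)
    if len(actual) != len(shape):
        raise ValueError(f"expected {len(shape)} dims, got {len(actual)}")
    for want, got in zip(shape, actual):
        if want is not None and want != got:
            raise ValueError(f"expected shape {shape}, got {actual}")
    if dtype is not None and value.dtype != dtype:
        raise ValueError(f"expected dtype {dtype}, got {value.dtype}")
    return value
```

Because the check only touches `.shape` and `.dtype`, a NumPy array, a dask array, or a lazily-opened HDF5 dataset could all pass through the same validator - which is the interoperability argument the post is making.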
This is a standalone piece of my work with @linkmlarrays on rearchitecting neurobio data formats like NWB to be dead simple to use and extend, integrating with the tools you already use across the experimental process: specify your data in a simple YAML format and get back high-quality data-modeling code that is standards-compliant out of the box and works with arbitrary backends. It's one step towards the wild exuberance of FAIR data that is just as comfortable in the scattered scripts of real experimental work as it is in carefully curated archives and high-performance computing clusters. Longer term, I'm trying to abstract away data-store implementations to bring content-addressed p2p data stores right into the Python interpreter, as simply as if the data were born in local memory.
Here is a short e-book with a sequence of tutorials on the scientific Python ecosystem for beginners, covering topics such as:
✅ Working with numerical data using NumPy
✅ Data visualization with Matplotlib
✅ Scientific computing with SciPy
✅ Statistics with Python
✅ Machine learning with scikit-learn
🚨 🚨 🚨 We're approaching the Final Call for Proposals for #PyOhio 2024!!! 🚨 🚨 🚨
This Sunday, Anywhere on Earth (AoE), will be your last chance to submit a talk for our awesome conference!
If you had fun at #PyCon and want to keep hanging out with the #Python community, or have something you want to share with the rest of us, please submit a talk! We love first-time speakers!
This is the first year that, after the #PyConUS sprints, I've found myself scanning the recent issues and pull requests on CPython's repository to watch improvements to a Python feature land in real time. ⏳
I was planning to wait until the next beta to re-install Python 3.13, but I had to try it out again yesterday after seeing some fixes land. 💗
I'm not a #Python core developer and I'm not usually an early adopter, but I am so excited for each new improvement in the new Python REPL. 🎉
You can wrestle with scientific formatting yourself, or you can use the sciform package from Justin Gerber!
sciform converts Python numbers into strings according to a variety of user-selected scientific-formatting options, including decimal, binary, fixed-point, scientific, and engineering formats, following documented standards wherever possible!
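To show the kind of chore sciform automates, here is a hand-rolled engineering-notation formatter (exponent constrained to a multiple of 3). This is my own stdlib-only sketch, not sciform's API, and it skips the rounding, uncertainty, and prefix options a real formatter needs:

```python
import math

def engineering(value, sig_figs=3):
    """Format a number in engineering notation: mantissa in [1, 1000),
    exponent a multiple of 3. Minimal illustration only."""
    if value == 0:
        return f"{0:.{sig_figs - 1}f}e+00"
    exp = math.floor(math.log10(abs(value)))
    eng_exp = 3 * (exp // 3)          # snap exponent down to a multiple of 3
    mantissa = value / 10 ** eng_exp
    return f"{mantissa:.{sig_figs - 1}f}e{eng_exp:+03d}"

# engineering(2120.0) -> "2.12e+03"
```

A library like sciform layers many formats, rounding modes, and separator conventions on top of this basic idea, which is exactly why reaching for it beats hand-rolling.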