General Idea > White Board Session > High Level Screen Flows (this) > Individual Wireframes > High-Fidelity UI > Coding.
These are useful tools to step back from individual screens and think of the broader ecosystem of the feature the team is trying to build. If actions on one page affect another page further down a flow, it's easy to reference that in a meeting by having it all laid out in a lower-fidelity, non-distracting way. For example: "if I add a user to this group, where will that user and her derivative information pop up across the experience?" becomes an easier discussion with an artifact like this.
I find that when working early on with multi-stakeholder or multi-department initiatives, some high-level UX documentation can be helpful in ensuring that, when everyone goes back to their desks, each group has the same general picture in their head.
It runs on a small embedded device that can stream zip archives many times larger than the disk or system RAM without any issue.
Example Python Falcon Proof of Concept:
https://gist.github.com/kylemanna/1e22bbf31b7e5ae84bbdfa32c6...
Other than what Python's zipfile buffers in memory, my implementation shouldn't use much more than an os.pipe()'s buffer (typically 64 kB?).
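For anyone who doesn't want to click through to the gist: the general pipe-based technique can be sketched roughly like this (this is an illustrative sketch, not the gist's actual code; the function name, member format, and chunk size are all made up). A writer thread feeds zipfile output into one end of an os.pipe(), and the caller consumes the archive from the other end, so memory use stays around the pipe's buffer size:

```python
import io
import os
import threading
import zipfile


def stream_zip(members, chunk_size=64 * 1024):
    """Yield a zip archive in chunks without buffering the whole file.

    ``members`` is an iterable of (arcname, bytes) pairs. Since Python 3.5,
    zipfile can write to an unseekable stream (it falls back to data
    descriptors), so a pipe works as the output target.
    """
    read_fd, write_fd = os.pipe()

    def writer():
        # Build the archive directly into the pipe's write end.
        with os.fdopen(write_fd, "wb") as sink:
            with zipfile.ZipFile(sink, "w", zipfile.ZIP_DEFLATED) as zf:
                for name, data in members:
                    zf.writestr(name, data)

    t = threading.Thread(target=writer, daemon=True)
    t.start()

    # Consume the archive chunk by chunk as the writer produces it.
    with os.fdopen(read_fd, "rb") as source:
        while True:
            chunk = source.read(chunk_size)
            if not chunk:
                break
            yield chunk
    t.join()
```

In a Falcon handler you'd set this generator as the response stream; the framework then pulls chunks on demand, which is what keeps peak memory roughly at the pipe buffer regardless of archive size.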
I need to open a very large CSV file in Python, which is around 25GB in .zip format. Any idea how to do this in a streaming way, i.e. stopping after reading the first few thousand rows?
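One possible approach, sketched with only the standard library (the function and parameter names here are placeholders, and it assumes the CSV is the first entry in the archive): zipfile.ZipFile.open() returns a file object that decompresses lazily as you read it, so you can wrap it in a csv.reader and simply stop iterating after the first few thousand rows without ever decompressing the rest.

```python
import csv
import io
import zipfile


def head_of_zipped_csv(zip_source, max_rows=5000):
    """Read only the first ``max_rows`` rows of a CSV inside a zip.

    ``zip_source`` may be a path or a file-like object. Decompression
    happens incrementally as the reader pulls bytes, so only a small
    buffer lives in memory rather than the whole archive.
    """
    with zipfile.ZipFile(zip_source) as archive:
        # Assumption: the first entry in the archive is the CSV we want.
        name = archive.namelist()[0]
        with archive.open(name) as raw:
            text = io.TextIOWrapper(raw, encoding="utf-8", newline="")
            rows = []
            for i, row in enumerate(csv.reader(text)):
                if i >= max_rows:
                    break  # stop early; the rest is never decompressed
                rows.append(row)
            return rows
```

If the rows need further processing, the same pattern works with a generator instead of a list, so even the first few thousand rows never sit in memory at once.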