We use a loose hash to match incoming requests with recorded flows. At the
moment, this hash is over the host, port, scheme, method, path and content of
the request. Note that headers are not included here; if we want to include
them later, we would have to do some work to normalize them to remove
variations between user agents, header order, and so on.
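
A minimal sketch of what such a loose hash might look like (the function name,
attribute names and hash choice below are illustrative, not the actual
implementation):

    import hashlib

    def request_hash(req):
        # Illustrative: hash only the fields listed above. Headers are left
        # out so user-agent and header-order differences don't break matching.
        parts = [req.host, str(req.port), req.scheme, req.method, req.path]
        h = hashlib.sha1("|".join(parts).encode("utf-8"))
        h.update(req.content or b"")
        return h.hexdigest()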
This means that certificates don't accumulate in the conf directory, users
don't have to clear certificates if the CA is regenerated, and the user can
specify a custom CA without invalid certificates being loaded inadvertently.
- Move option parsing utilities to proxy.py
- Don't have a global config object. Pass it as an argument to ProxyServer.
- Simplify certificate generation logic.
- Use templates for config files. We can re-introduce customization of the
certificate attributes when we need them.
- Split CA and cert generation into separate functions (see the sketch after
  this list).
- Generation functions return an error when generation fails.
- When the user explicitly specifies a certificate, we don't generate it, but
fail if it doesn't exist.
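
Roughly, the split might look like this (the function names, the openssl
invocation and the error-return convention are a sketch, not the actual code):

    import os
    import subprocess

    def generate_ca(capath):
        # Illustrative: create a self-signed CA with openssl.
        # Returns True on success, False if openssl fails.
        if os.path.exists(capath):
            return True
        ret = subprocess.call([
            "openssl", "req", "-new", "-x509", "-days", "720", "-nodes",
            "-newkey", "rsa:2048", "-subj", "/CN=mitmproxy",
            "-keyout", capath + ".key", "-out", capath,
        ])
        return ret == 0

    def generate_cert(certdir, capath, commonname):
        # Illustrative: create a per-host certificate signed by the CA.
        # Returns the certificate path, or None if generation fails.
        certpath = os.path.join(certdir, commonname + ".pem")
        if os.path.exists(certpath):
            return certpath
        # (openssl req / openssl x509 signing steps elided for brevity)
        return certpath if os.path.exists(certpath) else None

A user-specified certificate would bypass generation entirely and simply be
checked for existence, failing if it isn't there.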
This is a big patch removing the assumption that there's one connection per
Request/Response pair. It touches pretty much every part of mitmproxy, so
expect glitches until everything is ironed out.
The module is in the Public Domain.
I expect to modify and extend this module, so I've imported it into the main
library rather than contrib. The code has been reformatted to suit our code
standard, and the tests have been extracted into the /tests directory.
External scripts can read a flow, modify it, and then return it to mitmproxy
using a simple API.
The "|" keyboard shortcut within mitmproxy prompts the user for a script.
Also, since BeautifulSoup is so damn slow, print a statusbar message saying
that we're calculating a pretty version of the response. Maybe I should add
hangman or something, because on a 200k document this can take ages.
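
The slow path is essentially a prettify pass, something like this (a rough
sketch; statusbar_message is a hypothetical helper, and bs4 stands in for the
BeautifulSoup version actually used):

    from bs4 import BeautifulSoup

    def prettify_response(body, statusbar_message):
        # Warn the user first, because prettify() on a large document is slow.
        statusbar_message("Calculating pretty mode...")
        return BeautifulSoup(body, "html.parser").prettify()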