Yes, there's some lovely mismatch between the Mapping and Dict types to work
around. Mapping complains about the lack of __delitem__, so we declare it as
a Dict instead and ensure that matches at the call site.
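A minimal sketch of the typing complaint involved (the function and key names
here are illustrative, not the actual EDMC code):

```python
from typing import Dict, Mapping


def prune_readonly(entry: Mapping[str, object]) -> None:
    # mypy rejects this: Mapping is read-only, so it has no __delitem__.
    del entry['localised']


def prune(entry: Dict[str, object]) -> None:
    # Fine: Dict is mutable, so the deletion type-checks.
    del entry['localised']


# The call site then has to actually pass a Dict:
entry: Dict[str, object] = {'event': 'FSSDiscoveryScan', 'localised': 'x'}
prune(entry)
```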
* As we're now not sending *all* journal events on the journal schema, this
  refactors the actual sending into export_journal_entry(), with the other
  functionality in export_journal_generic(); that way the new
  export_journal_fssdiscoveryscan() can use export_journal_entry() as well
  (sketched below).
In the future EDDN will move to a schema per event type, so we'll lose
_generic(). Factoring out some things like augmentations into their own
functions will come next.
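A minimal sketch of the resulting shape, with simplified signatures, and
schema refs quoted from memory rather than copied from the code:

```python
def export_journal_entry(cmdr: str, entry: dict, msg: dict) -> None:
    """Just the actual sending: queue/POST `msg` to the EDDN upload endpoint."""
    ...


def export_journal_generic(cmdr: str, entry: dict) -> None:
    """Augment a generic journal event and send it on the journal schema."""
    msg = {
        '$schemaRef': 'https://eddn.edcd.io/schemas/journal/1',
        'message': entry,
    }
    export_journal_entry(cmdr, entry, msg)


def export_journal_fssdiscoveryscan(cmdr: str, entry: dict) -> None:
    """Send an FSSDiscoveryScan event on its own, event-specific schema."""
    msg = {
        '$schemaRef': 'https://eddn.edcd.io/schemas/fssdiscoveryscan/1',
        'message': entry,
    }
    export_journal_entry(cmdr, entry, msg)
```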
In general, doing things like this on import is bad, but this was changed
specifically to remove a bug that causes --force-localserver-auth to do
nothing.
That happens because, while we were careful not to import protocol ourselves
until after argument handling was done, we didn't check that nothing else
imported it first. The companion module imports protocol, which thus always
instantiates WindowsProtocolHandler before we can modify the config state
indicating that we want LinuxProtocolHandler instead.
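A self-contained illustration of that failure mode, with stand-in names rather
than the real module layout:

```python
import sys


class WindowsProtocolHandler:
    """Stand-in for the Windows protocol handler."""


class LinuxProtocolHandler:
    """Stand-in for the handler wanted with --force-localserver-auth."""


config = {'force_localserver_auth': False}  # argument parsing sets this later

# What effectively happens when protocol gets imported too early: the handler
# is chosen at import time, before --force-localserver-auth has been seen.
protocolhandler = (WindowsProtocolHandler()
                   if sys.platform == 'win32' and not config['force_localserver_auth']
                   else LinuxProtocolHandler())

# By the time argument handling flips the flag, the choice is already made:
config['force_localserver_auth'] = True
print(type(protocolhandler).__name__)  # still WindowsProtocolHandler on Windows
```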
This also entailed slightly reworking the way the EDDN code uses this
URL. It was very generalised, so as to allow for the debug "just send
and log locally" code, but as the only URL is the 'upload' one much of
that seemed un-necessary.
So that code has been simplified.
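For illustration only (the names and structure here are assumptions, not the
actual EDMC code), the sort of simplification meant is:

```python
import requests

# Before: a generalised lookup, even though only one URL was ever asked for.
URLS = {'upload': 'https://eddn.edcd.io:4430/upload/'}


def send_generalised(kind: str, msg: dict) -> requests.Response:
    return requests.post(URLS[kind], json=msg)


# After: the single upload URL, used directly.
UPLOAD_URL = 'https://eddn.edcd.io:4430/upload/'


def send(msg: dict) -> requests.Response:
    return requests.post(UPLOAD_URL, json=msg)
```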
The ERROR log level is needlessly high for complaining about git not being installed, especially when we're running from an extracted source archive rather than a repo. Let's keep things calm and avoid polluting desktop session logs. The INFO log level is plenty.
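A hedged sketch of the kind of check meant, with a hypothetical function name
and messages:

```python
import logging
import shutil
import subprocess
from typing import Optional

logger = logging.getLogger(__name__)


def git_shorthash() -> Optional[str]:
    """Return the current git short hash, or None if it can't be determined."""
    if shutil.which('git') is None:
        # Likely running from an extracted source archive: calm INFO, not ERROR.
        logger.info('"git" not found in PATH, cannot determine commit hash')
        return None

    try:
        return subprocess.check_output(
            ['git', 'rev-parse', '--short', 'HEAD'], text=True
        ).strip()

    except subprocess.CalledProcessError:
        # Probably not a git checkout at all; again, INFO is plenty.
        logger.info('Not a git checkout, cannot determine commit hash')
        return None
```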
This had been wrong since 291fbf2908e0f7419769a7727ee0a79cf58a9342 because
the check said "only active if in CQC", when it needed to be "only active
when NOT in CQC, along with these other conditions".
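In essence (with purely illustrative names and conditions), the fix is:

```python
def should_be_active(in_cqc: bool, cond_a: bool, cond_b: bool) -> bool:
    # Broken form: the code was gated on being *in* CQC.
    #     return in_cqc
    # Fixed form: only active when NOT in CQC, alongside the other conditions.
    return not in_cqc and cond_a and cond_b


assert should_be_active(in_cqc=False, cond_a=True, cond_b=True)
assert not should_be_active(in_cqc=True, cond_a=True, cond_b=True)
```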
* Session.profile() was unused (used to be called from stats.py, but
that uses cached data now).
* Session.query() was unused, other than by Session itself. Normal calls
  will currently be via companion.Session.station(). Future CAPI queries
  like `/fleetcarrier` might add their own companion.Session function (a
  sketch follows this list).
* And in doing so get --capi-pretend-down working again.
* Small tweak to EDMarketConnector so it doesn't throw an extra exception
  if there was a CAPI query exception.
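A hedged sketch of what such an endpoint-specific wrapper might look like; the
method names, internals and endpoint strings here are assumptions, not the
actual companion.Session API:

```python
class Session:
    """Sketch of a CAPI session with one public method per kind of query."""

    def _query(self, endpoint: str) -> dict:
        """Low-level CAPI GET; the real code handles auth, retries and errors."""
        raise NotImplementedError

    def station(self) -> dict:
        """Current style of call: fetch the data station-related code needs."""
        return self._query('/profile')

    def fleetcarrier(self) -> dict:
        """Hypothetical future wrapper for a `/fleetcarrier` CAPI query."""
        return self._query('/fleetcarrier')
```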
NB: We can't use a generator here to build a Python object of the data and
then json.dumps() that, because raw_data is a *string* (decoded from what we
received from the CAPI service), and thus it would get encoded as such, i.e.
"raw_data": "{\"id\":322...
when we want:
"raw_data": {"id":322...
We do not want to json.loads() that string only to then json.dumps() it,
because the whole point is that this is the **raw** data, kept to help
diagnose any issues with the CAPI service/data. Such a round-trip could
either throw an exception we don't want here (because we want the raw data
regardless) or possibly distort things from what was actually received.
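A minimal demonstration of the double-encoding, using made-up data:

```python
import json

# raw_data as received: already a JSON *string*, not a Python object.
raw_data = '{"id": 322, "name": "Example"}'

# Embedding that string in a dict and dumping it escapes it as a string value:
print(json.dumps({'raw_data': raw_data}))
# {"raw_data": "{\"id\": 322, \"name\": \"Example\"}"}

# What we actually want in the output is the object form:
print(json.dumps({'raw_data': json.loads(raw_data)}))
# {"raw_data": {"id": 322, "name": "Example"}}

# But json.loads() followed by json.dumps() is exactly what the text above
# rules out: it can raise on malformed input and can alter formatting from
# what was actually received, defeating the point of keeping the *raw* data.
```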