I've been doing my family's bookkeeping for a few years using a variety of tools. Unfortunately, none of them had the combination of features I wanted: showing actual vs. planned expenses by category; automated data import from bank and shop websites that don't have an API; and integration of data from shops and banks (for example, I want to see that a $100 credit card charge paid for an online order consisting of a $10 T-shirt, a $30 video game, a $50 microwave oven, $5 of shipping and $5 of taxes). So I resorted to writing such a tool myself, and now I'm happy to share it with you all.
I'd love to share my real bookkeeping report with you, but I can't because of privacy concerns. Fortunately, my imaginary friends Phillip and Leela kindly agreed to share theirs. You can check it out right here.
This example is included in the project homepage build automation. Feel free to explore and play around with it.
If you want to get in touch, please feel free to drop me a line.
It runs continuously as a daemon, scrapes data from bank and shop websites, transforms it into a single consistent bookkeeping journal, and generates the report.
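To make the pipeline concrete, here is a minimal sketch of such a daemon loop. All names and types here are hypothetical illustrations, not the tool's actual API, and the scrape and transform stages are stubbed out.

```haskell
import Control.Concurrent (threadDelay)

newtype Scraped = Scraped [String]  -- raw rows from a bank or shop website
newtype Journal = Journal [String]  -- the consistent bookkeeping journal

-- Stub: the real scraper drives website modules instead.
scrape :: IO Scraped
scrape = pure (Scraped ["2015-06-01 VISA -100.00"])

-- Stub: the real transformation applies the user's adjustment callbacks.
transform :: Scraped -> Journal
transform (Scraped rows) = Journal rows

render :: Journal -> String
render (Journal rows) = unlines rows

-- The daemon repeats the scrape/transform/report pipeline forever,
-- sleeping between runs.
daemon :: IO ()
daemon = do
  journal <- transform <$> scrape
  writeFile "report.txt" (render journal)
  threadDelay (60 * 60 * 1000000)  -- once an hour
  daemon
```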
Shows the history of actual vs. planned transactions, split by category. Individual order items, as well as bank transactions, can be categorized.
Uses Weboob for scraping. It's open-source: you can create new modules for the websites you use, or fix existing ones when a website gets updated.
Integrates data from shop and bank websites into a single consistent bookkeeping journal. For example, here's how a transaction group generated from an online order looks in the report.
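Such a group could look roughly like the following double-entry sketch, using the $100 order from the introduction. The `Posting` type and account names are hypothetical, but the idea is the same: the card charge and the individual items live in one group, and the group balances to zero.

```haskell
-- Hypothetical representation: the real journal format may differ.
data Posting = Posting { account :: String, amount :: Rational }

-- A $100 credit card charge integrated with the order it paid for.
orderGroup :: [Posting]
orderGroup =
  [ Posting "liabilities:credit-card" (-100)
  , Posting "expenses:clothes"          10  -- T-shirt
  , Posting "expenses:games"            30  -- video game
  , Posting "expenses:appliances"       50  -- microwave oven
  , Posting "expenses:shipping"          5
  , Posting "expenses:taxes"             5
  ]
```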
The user adjusts the transformation of scraped data into the bookkeeping journal by writing a set of callbacks in Haskell. Here's how Phillip and Leela's report looks without custom adjustments. And here's the same report with custom adjustments.
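As a flavor of what such a callback might look like, here is a hypothetical one: the `Txn` type, field names and `categorize` function are illustrative only, not the tool's real API.

```haskell
import Data.List (isInfixOf)

-- Hypothetical transaction record.
data Txn = Txn { descr :: String, category :: String }

-- One illustrative callback: put every transaction whose description
-- mentions "AMAZON" into a dedicated category, leave the rest untouched.
categorize :: Txn -> Txn
categorize t
  | "AMAZON" `isInfixOf` descr t = t { category = "expenses:online-orders" }
  | otherwise                    = t
```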
Validates the consistency of the bookkeeping journal. If the journal is consistent, then all account balances add up to the ones scraped from the banks; all expenses, incomes and assets add up to zero; and no two categories overlap.
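These three invariants can be sketched as predicates over a flat list of postings. This is an assumed representation for illustration, not the tool's actual data model.

```haskell
import Data.List (nub)

-- Hypothetical posting type: an account name and a signed amount.
data Posting = Posting { account :: String, amount :: Rational }

-- All postings in a consistent journal add up to zero.
sumsToZero :: [Posting] -> Bool
sumsToZero = (== 0) . sum . map amount

-- Per-account balances computed from the journal must match
-- the balances scraped from the banks.
matchesScraped :: [(String, Rational)] -> [Posting] -> Bool
matchesScraped scraped ps =
  all (\(acc, bal) -> bal == sum [amount p | p <- ps, account p == acc])
      scraped

-- No account may be listed under two different categories.
categoriesDisjoint :: [[String]] -> Bool
categoriesDisjoint cats =
  let accs = concat cats
  in length accs == length (nub accs)
```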
Can merge old archived scraped data with fresh data just retrieved from the website. This comes in handy when a website suddenly decides to dispose of old user data.
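One simple way to sketch such a merge, assuming each scraped record carries a stable identifier (a hypothetical keying scheme, not necessarily what the tool does): keep every archived record, and add only the fresh records that weren't seen before.

```haskell
import qualified Data.Map.Strict as Map

type TxnId = String

-- Left-biased union: on a conflicting id the archived record wins,
-- and records the website has since dropped survive from the archive.
merge :: Map.Map TxnId String -> Map.Map TxnId String
      -> Map.Map TxnId String
merge archived fresh = Map.union archived fresh
```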
Runs on Arch Linux, but with a reasonable amount of effort it can be built on Windows, Mac OS X and many other Unix-like operating systems.
There's no support for multiple currencies, although it shouldn't be difficult to add. I just didn't have a use case for it. If you do, contributions are welcome.
Copyright 2015 Oleg Plakhotniuk