Slingshot On Target

When I last left you, I had moved a scratch copy of my application to a shiny new Joyent Accelerator, brought it current with Rails, and was preparing a migration to add new created_at and updated_at columns to my database.

All that went well. Several of my models already had created_on (date) fields, but Slingshot needed more detail, so those got converted over to full datetimes. I also inserted raw SQL into the migration to quickly set all of the new created_at and updated_at values to the current date/time.
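For the record, the migration looked roughly like this. This is a sketch, not my actual migration: the `projects` table name is hypothetical, I'm assuming MySQL's NOW(), and the real thing touched several tables.

```ruby
class ConvertToTimestamps < ActiveRecord::Migration
  def self.up
    # created_on already existed as a date; widen it and rename it to created_at.
    change_column :projects, :created_on, :datetime
    rename_column :projects, :created_on, :created_at
    add_column    :projects, :updated_at, :datetime

    # Raw SQL to backfill the new column in one shot. Tables that never
    # had a created_on got both columns set to NOW() the same way.
    execute "UPDATE projects SET updated_at = NOW()"
  end

  def self.down
    remove_column :projects, :updated_at
    rename_column :projects, :created_at, :created_on
    change_column :projects, :created_on, :date
  end
end
```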

The Slingshot synchronization process has two components: a plug-in/generator that adds a new controller and several methods to the “live” application, and a somewhat complicated rake task that runs on the client side.

In the new controller you build an array of arrays containing the data that can be brought up and down, using any rules and logic you need. For my first tests, I went ahead and included almost everything. Future work will have to be much more involved, as my application has a three-tiered authentication scheme. First, each registered subdomain acts as a standalone instance of my application (though in truth they’re all served up by the same Mongrel processes). Second, each subdomain has a set of users. Third, each user has a set of roles. There is a mechanism for passing authentication information from the client to the server, but in the interest of just getting things working, I only bothered to check the subdomain.
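In spirit, that allowed-data structure is just an array of arrays filtered by whatever rules you need. Here's a minimal plain-Ruby sketch of the subdomain check, with hypothetical table names and in-memory hashes standing in for the models (the real controller pulls from ActiveRecord, and Slingshot's actual API may differ):

```ruby
# Build the "array of arrays" payload: one inner array per model,
# holding only the records the requesting subdomain is allowed to see.
def syncable_data(subdomain, tables)
  tables.map do |name, records|
    [name, records.select { |r| r[:subdomain] == subdomain }]
  end
end

tables = {
  "projects" => [{ :id => 1, :subdomain => "acme" }, { :id => 2, :subdomain => "other" }],
  "tasks"    => [{ :id => 7, :subdomain => "acme" }]
}

payload = syncable_data("acme", tables)
# payload => [["projects", [{:id=>1, :subdomain=>"acme"}]],
#             ["tasks",    [{:id=>7, :subdomain=>"acme"}]]]
```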

The controller (mapped to “sync” in my routes.rb) has three main actions: “down”, “up”, and “log”. The down action builds the array of arrays of allowed data and then serves up an XML file containing that data along with some metadata. The up action… well, I actually haven’t gotten that far yet. The log action records successful imports and exports, so the process knows where to pick up next time.
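To make the down action concrete, here's a plain-Ruby sketch of serializing such a payload to XML with REXML. The element names and the generated_at attribute are illustrative, not Slingshot's actual wire format:

```ruby
require 'rexml/document'

# Serialize the payload into the kind of XML document sync/down returns:
# a root element carrying metadata, with one element per table and record.
def to_sync_xml(payload, generated_at)
  doc = REXML::Document.new
  root = doc.add_element("sync", "generated_at" => generated_at)
  payload.each do |table, records|
    tbl = root.add_element("table", "name" => table)
    records.each do |record|
      rec = tbl.add_element("record")
      record.each { |field, value| rec.add_element(field.to_s).text = value.to_s }
    end
  end
  doc
end

xml = to_sync_xml([["projects", [{ :id => 1, :name => "Demo" }]]],
                  "2007-05-01 12:00:00")
```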

The rake task does all of the client-side work. It calls sync/down on the server, receives the XML data and saves it to the local /log directory, parses it, and performs the necessary deletes, updates, and inserts against the local SQLite database.
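The parsing half of that can be sketched the same way: turn the XML back into plain hashes that the rake task could then feed through the models. Again, the document structure here is illustrative, not Slingshot's actual schema:

```ruby
require 'rexml/document'

# Parse the sync XML back into { table_name => [record_hash, ...] },
# which the rake task would turn into deletes, updates, and inserts.
def parse_sync_xml(xml_string)
  doc = REXML::Document.new(xml_string)
  data = {}
  doc.elements.each("sync/table") do |tbl|
    records = []
    tbl.elements.each("record") do |rec|
      row = {}
      rec.elements.each { |field| row[field.name] = field.text }
      records << row
    end
    data[tbl.attributes["name"]] = records
  end
  data
end

xml = <<XML
<sync generated_at="2007-05-01 12:00:00">
  <table name="projects">
    <record><id>1</id><name>Demo</name></record>
  </table>
</sync>
XML

data = parse_sync_xml(xml)
# data["projects"] => [{"id" => "1", "name" => "Demo"}]
```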

It took me a *long* time to get that part working properly. At first, it was failing silently, and I didn’t know enough to be watching OS X’s terminal log to see what was going on. I actually opened it by accident and found the error messages I had been looking for. The first few were caused by errors in the documentation (then very meager, but now much better as Joyent prepares for a general release), but I was able to get past those in pretty short order.

The rest were my fault. The data transfers are not raw SQL: they create instances of your models and use all of the regular model methods, including validations. First, the instances were failing because I had not installed all of the gems required by my application in Slingshot’s local VM (more on that in another post). Once I took care of that, they were failing on the save validations. I have a number of custom validations that check for things such as uniqueness within a subdomain, and as I mentioned above, I was skipping most of my authentication for this test. I got past this by modifying the rake task to use save(false), thus bypassing validations. I think it’s safe to say the data coming from the live app has already been validated, so I was fine with that change. Then saves were failing because of database integrity constraints. You see, when my real app first went live, I allowed null values for several fields that I have since made NOT NULL. The old null values were still in the database, though, so when the imported data was saved, SQLite rejected it. I fixed that by changing the nulls to “Unknown” and everything was fine.
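The null fix amounted to substituting a placeholder wherever a required field came through empty. A tiny plain-Ruby sketch of the idea, with hypothetical field names:

```ruby
# Replace nil values with an "Unknown" placeholder so rows carried over
# from the live app satisfy the local NOT NULL constraints on save.
def fill_unknowns(record, required_fields)
  required_fields.each do |field|
    record[field] = "Unknown" if record[field].nil?
  end
  record
end

row = { "id" => "3", "name" => nil, "status" => "open" }
fill_unknowns(row, ["name", "status"])
# row => { "id" => "3", "name" => "Unknown", "status" => "open" }
```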

So that’s where I stand. Slingshot is able to pull data down and I can interact with that data locally. Next up, sending the changes back up. And then, adding the real authentication.
