Monday, April 2, 2018

E-books and e-readers

It's been ages since I've posted, but I thought I'd record my recent experience with e-books and e-readers.

As context, I've recently dusted off my original Nook e-reader, and have been trying to get as much of my e-book library on to it as possible. At the same time, I've been researching new e-readers because there have been a number of technology improvements since the release of the first edition Nook.

E-books

My e-book purchases have come from a number of retailers: Amazon's Kindle store, Barnes & Noble's Nook store, Google Play, and Pragmatic Programmers, among others.
Of course, the name of the game with e-books these days is DRM: some distributors apply strong DRM, some weak DRM, and some none at all.

Both Kindle and Nook e-books are useless outside their respective ecosystems.

Kindle books can be downloaded with a "Download & transfer via USB" option, but the AZW3 file is encrypted to a specific device. I see references that this wasn't always the case, but that was my experience.

Nook no longer offers a download option at all; books must be delivered via the cloud to their devices and mobile apps. I saw references to a Nook Study desktop app but couldn't locate a site to download it. I had also previously downloaded EPUB files, but Adobe Digital Editions asks for a username and "unlock code" to decrypt those. I see references online that the credentials are tied to the credit card used for the purchase (username is the name on the card, unlock code is the credit card number itself), but there's no way I have that information for books I purchased 6 years ago.

Google Play purchases have some books with DRM, but so far I've been able to open those in Adobe Digital Editions without trouble. I'll explain below, but if I can open it in ADE, I can remove the DRM and read the book on whatever device I please.

Other retailers generally don't use strong DRM. Pragmatic Programmers has a unique scheme: their books are completely DRM-free, but each copy is generated with the purchaser's name and email address embedded in the content (a watermarking approach sometimes called social DRM).

E-book library management

In this quest to organize everything, I ran across Calibre, and it's hands down the tool I always wanted for this sort of thing. It can track books across multiple retailers; filter by author or publisher; fetch metadata; import and organize vast, unorganized collections of files; and track a single library entry across multiple file formats. It's a lot like Plex for video media. Calibre will even convert between file formats, removing one of my big concerns about devices: the fact that Kindle doesn't support EPUB.

The one thing Calibre doesn't do out of the box is anything concerning DRM, but luckily there's a plugin by Apprentice Alf and Apprentice Harper that will read DRMed files and, when possible, strip the DRM. Note: I paid for the content. I didn't get it from a pirate, and I'm not giving it to pirates. The author and publisher got their cut, so I feel no remorse in breaking DRM implemented by retailers trying to lock me into their ecosystems. Apprentice Alf's DeDRM Tools can remove the DRM from anything Adobe Digital Editions can decrypt, but as noted earlier, I haven't found a way for ADE to access Nook or Kindle purchases, at least not yet.

New E-Reader?

In considering a new e-reader, I was at first committed to staying with Nook, primarily because of their EPUB compatibility and a bit of brand loyalty. However, Kindle's devices seem superior in polish and variety of models. With Calibre, EPUB compatibility is less of a problem, since it's straightforward to convert EPUB to MOBI. And since Nook has locked down their walled garden, brand loyalty holds less sway. I could look at Kobo as well, but really, Kindle's the big show.
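For what it's worth, that conversion doesn't even require the Calibre GUI; it can be scripted with Calibre's command-line ebook-convert tool. A quick sketch, where the filenames are placeholders and the book is assumed to be DRM-free:

```shell
# Convert a DRM-free EPUB to MOBI, tuned for Kindle devices
ebook-convert "some-book.epub" "some-book.mobi" --output-profile kindle
```

The output format is inferred from the second filename's extension, so the same command works for other conversions too.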

Which retailer for the future?

After all of this research and testing, the other thing I have to decide is, given a choice, which e-book retailer should I prefer?

With e-books in particular, I feel like the freedom to move around should be a given; it seems like there should be fewer excuses for brand lock-in than any other type of media. For that reason, I'm less inclined to purchase books from Kindle or Nook. Sure, there's a bit of convenience from cloud delivery and last-read-page syncing or sharing/lending options, but it's not worth being stuck in a walled garden. However, if push comes to shove and I really want to purchase a recent, popular title, I think I prefer Kindle's current household sharing model; it's light years ahead of Nook's one-off lending feature.

In the end, I don't know if I'll stick to my guns and prefer Google Play Books, or get comfortable with the walled garden and switch to Kindle books.

Update

2018-04-04: I had luck today using Kindle for PC in conjunction with DeDRM Tools, so Kindle books appear to be a reasonable purchase right now. On the other hand, DRM is a game of cat and mouse, so I'll probably still lean toward more permissive retailers (Google Play Books) when possible.

Friday, March 23, 2012

Buzzing XBees

It's been a while since I've written, but I just have to share my experiences with this. I recently bought a pair of XBee Series 2 radios along with XBee Explorer break-out boards. I first heard about these radios back in school when I was working with CU's RECUV group, and then again in Tom Igoe's excellent book Making Things Talk, and I've been itching to try them out ever since. They're a bit like Wi-Fi or Bluetooth, but they lend themselves better to microcontroller projects, generally use less power, and have some unique features that are great for sensor networks. I understand they're also being used in modern home automation products.


However, getting these things configured and tested wasn't exactly a walk in the park! As it turns out, I probably should've gotten the Series 1 radios, as they use the simpler 802.15.4 protocol. Series 2 uses the ZigBee protocol, which allows for more advanced network topologies but is also harder to configure. In the end, though, it wasn't really that difficult, and I appreciate the greater flexibility, so I'm happy with my purchase. Because of the difficulty of those first few steps, I'm taking it upon myself to write up a few lessons learned in getting these going.


Once I soldered headers onto the Explorer boards, my next task was to hook them up to USB-to-serial adapters to program them. There are Explorer boards with built-in serial adapters, but they're more expensive, and I already have a separate adapter. However, this meant I could only hook up one radio at a time, and it turned out to be very handy to have both hooked up. I eventually found a blog post that indirectly guided me to using my Arduino's USB-to-serial chip as a sort of pass-through. If you run a jumper wire from GND to the RESET pin on the Arduino board, that effectively disables the ATmega chip so it doesn't interfere with the UART pins. (You can also remove the chip from its socket, but the jumper is easier.) After that, you can hook up +5V, GND, and the two signal wires from the Arduino to the Explorer. In this configuration the Explorer sits at the far end of the Arduino's own serial line, so you hook up RX to DIN and TX to DOUT straight through, instead of crossed over as in normal device-to-device serial.
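To summarize the pass-through hookup described above (pin positions assume a standard Uno/Duemilanove-style Arduino and a regulated Explorer board that accepts 5V; check your own boards):

```
On the Arduino itself:
  GND ── jumper ── RESET      (disables the ATmega)

Arduino              XBee Explorer
5V    ─────────────  5V       (the regulated Explorer steps this down for the XBee)
GND   ─────────────  GND
Pin 0 (RX) ────────  DIN      (straight through, not crossed)
Pin 1 (TX) ────────  DOUT
```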


In that configuration, I was able to start two copies of the X-CTU configuration program (v. 5.2.7.5 under the General Diagnostics, Utilities and MIBs tab as I write), and start setting up the radios' software. On the opening tab, PC Settings, I set one copy of the program to each COM port (4 and 6 in my case) and made sure I could Test/Query both radios. My radios came set to 9600 baud.


After that, I started poking around in the Modem Configuration tab. By clicking Read, I was able to find out that both of my radios came loaded with the ZigBee Router AT firmware. If you read the product manual, you'll learn that a ZigBee network must have exactly one radio with coordinator firmware, even for simple point-to-point connections. With that in mind, I changed one of the radios to a coordinator by selecting ZigBee Coordinator AT in the Function Set dropdown box and then writing the new firmware. On both modules, I set the same PAN ID. In ZigBee (as opposed to plain 802.15.4), the 16-bit network address is either implicitly 0 for the coordinator or randomly assigned for all other nodes, so those fields aren't editable. Since I'm starting with a point-to-point network, on each module I set the destination address (DH and DL) to the serial number (SH and SL) of the other device.
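The same settings can also be entered from the Terminal tab with AT commands instead of the GUI. A sketch of such a session follows; the values are examples only, and the +++ escape sequence needs a one-second guard time of silence before and after it:

```
+++              <- enter command mode; the radio answers OK
ATID 2001        <- PAN ID; use the same value on both radios
ATDH 13A200      <- destination high word = the other radio's SH
ATDL 403B9E12    <- destination low word  = the other radio's SL
ATWR             <- save the settings to non-volatile memory
ATCN             <- leave command mode
```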


I could be forgetting something, but that should be enough to get point-to-point communication going. From the Terminal tab on both windows, I'm able to type on one side and watch the other side receive. In the photo, blue is what I typed, and red is what was received by the other radio.


Once all of that was established, I decided to hook up the radio to the Arduino "for real." To switch the Arduino from a pass-through to a microcontroller, I just removed the wire from the RESET pin, and swapped the signal wires so that I had RX to DOUT and TX to DIN (as in normal, crossed-over serial communication). After that, I loaded the Arduino with a simple program using the Serial object. The sketch just listens to serial in, and when it gets a byte, increments it and writes it back out.
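I didn't paste the sketch into the post, but the logic is simple enough to sketch here. On the Arduino, loop() would poll Serial.available(), Serial.read() each byte, and Serial.write() it back incremented. Below is the same increment-and-echo logic as a host-side Java sketch; the class and method names are mine:

```java
// Host-side sketch of the Arduino program's increment-and-echo logic.
// On the Arduino itself, loop() would amount to:
//   if (Serial.available() > 0) { Serial.write(Serial.read() + 1); }
public class IncrementEcho {
    // Increment every byte by one (wrapping at 255), as the sketch does
    static byte[] increment(byte[] data) {
        byte[] result = new byte[data.length];
        for (int i = 0; i < data.length; i++) {
            result[i] = (byte) (data[i] + 1);
        }
        return result;
    }

    public static void main(String[] args) {
        // Simulate typing "abc" in the terminal and the far radio echoing back
        String echoed = new String(increment("abc".getBytes()));
        System.out.println(echoed); // prints "bcd"
    }
}
```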


As you can see, the module attached to the computer (and visible through the X-CTU window) sends a letter (in blue). The radio attached to the Arduino gets the letter and passes it to the Arduino. The Arduino increments the letter, passes it back to the attached radio, which sends the new letter to the computer's radio and back to the terminal (in red).

That's the extent of my exploration so far. Nothing too exciting, but I'm quite pleased that I'm this far. Next up, maybe I'll control a servo on one Arduino using a potentiometer on another. We'll see!

Sunday, October 9, 2011

My Robot Friend

It's been a bit since I've posted, but fear not! Things are going well, and some exciting things are taking shape!

Work is as exciting as ever. There are new challenges almost every day it seems, yet nothing that seems to be impossible. Our systems environment is ever-changing, and the causes and motivations are numerous, but it's all in a day's work. I've also taken it upon myself to start implementing measurable quality metrics in my work, and that has been an interesting journey. I'm starting to learn a number of tools and techniques for automated building, testing, and deployment, and those are skills that should serve me anywhere on any project.

Outside work, I got the itch to do some engineering mentoring, so I got in touch with my local high school. They don't currently have any programs like the old "computer club" I attended when I was there (it was really just 3 or 4 of us who would hang around in the computer lab after school). However, just as I contacted them, the tech lab teacher was gearing up a team for a robotics competition called BEST. Basically, over a 6-week period, a team of high schoolers must design and build a fair-sized robot (it has to fit within a two-foot cube) out of basic provided materials, like plywood and PVC pipe, plus an electronics kit. The robot must then perform certain game tasks in competition. I've been volunteering my time with them, and I'm really having a blast, even if there isn't all that much programming involved. Many general engineering principles, like designing to meet goals and building prototypes, still apply. Maybe after the competition, I can keep something going through the rest of the school year. We'll see.

Sunday, September 11, 2011

Ten Years Ago, Today

I was in the 10th grade. The morning started off like any other, with my morning bike ride to school. My first class was Spanish. There were rumors of confused news reports about a small plane accidentally glancing off of the World Trade Center, but our teacher would allow no distractions. Our class plowed through the planned material.

But when I arrived at 2nd period chemistry, it was clear that this was no normal day, no small news story, and no accident. Our chemistry teacher didn't say much, and didn't attempt to teach class. We listened to the radio reports from shocked and confused journalists in New York. We heard that commercial airliners had been hijacked and crashed into the towers and the Pentagon; that both towers had just fallen; that the streets of Manhattan were rivers of dust and ash; that tens of thousands were injured, trapped, or dead.

It took time for the magnitude of the reports to crystallize in my head. This had happened on American soil? The shock was profound to me, a teenager who had never heard of a disaster of this scale in my own country during my own lifetime. The next few hours are now just a blur, but I know the question eventually solidified: what was going to be attacked next? Every feeling of comfort and safety was shattered, for myself and everybody around me.

I remember sitting with friends in the school cafeteria at lunchtime. Half of the students had already left school. In my recollection, there was no official release by the administration, but our teachers were also in shock, and knew that on this day, nobody was thinking of anything else. Some kids had left or been pulled out by their parents so they could be with their families. Some kids left because they were afraid that the school might be targeted, images of Columbine High School still fresh in their minds. Other kids didn't immediately grasp the gravity of the event, saw the chaos reigning in the school, and left because they smelled the opportunity for a day of freedom.

When I got home, I sat in a daze in front of the TV news reports. The hijackers had been foreign terrorists. My train of thought was, "Really? Who would want to do that to us? We're Americans! We have our faults, but no one actually hates us, do they?" There had been four planes, and one had mysteriously gone down in a field in Pennsylvania; there were hazy reports that the passengers might have fought the hijackers and forced the plane down. The whole thing was entirely surreal.

The end of the day seemed drained of hope, but over the next weeks and months, I saw the country come together in strength and hope. Out of the wreckage, miraculous stories of heroism, courage and compassion emerged. For every mention of the innocent lives lost, there were tales of the brave people that selflessly gave for their fellow human beings. That day touched the life of every American, and I like to think that we've become better for it, as a nation and a people.

Sunday, July 10, 2011

Google Map Maker

For those that don't know, Google Maps just opened up the Google Map Maker web application to allow normal users to enter data into Google Maps. After a review process, those edits can be published into the public Google Maps for the world to see. I was pretty excited when I learned about it in April, and I've been pretty active.

Recently I received an e-mail from the Google Map Maker Community Team. Because I'm "one of the top mappers in the United States," they offered to send me a free t-shirt, and invited me to the Google Geo User Summit in Mountain View. I had to regretfully decline the offer to attend the summit, but the t-shirt is pretty awesome.

(Photo: the Map Maker t-shirt, from July 2011)

My friends and I were trying to figure out what the map on the front is showing. At first glance, it looks like a map of the world's lights at night, but some major countries, like the US, Canada, UK, France, and Russia are not lit up. So what do the white areas represent? My speculation is that those represent major road systems that Map Maker users have added which were not previously recorded in Google Maps, but I'm not sure. Does anyone know?

Friday, July 8, 2011

Graduation Videos

Check these out! Courtesy of Munchkin, thanks friend!



Sunday, July 3, 2011

Exploring Google App Engine with Restlet and Objectify

Warning: this post is excessively geeky for my usual audience. Non-programmers, feel free to skip.

I've been doing a little more exploration with Google App Engine. I've used it before, first for my Weasley Clock, and also for one of my classes, Object-Oriented Analysis and Design. In that OOAD class, my team and I had a short amount of time to build an Android app backed by a RESTful web service on GAE. We ended up using JAX-RS/Jersey as a RESTful framework, JAXB for server-side XML serialization, and JDO for the datastore API. Then on the Android client, we used the Apache HTTP client and SAX parsers. That worked out all right, but recently I've wanted to explore what other options are out there.

I've been interested in Restlet for a while; in OOAD, I explored that first before moving to JAX-RS. It seems to be a pretty comprehensive solution. They provide both server and client libraries for many platforms, with flexible configuration for many needs, such as filters. What turned me off last time, and almost again this time, was that the source code for the "first application" example didn't work out of the box. I had to decide which version of the library I needed, download four different editions, and figure out which .jars to put where. Then for a long while, I was stuck on an interesting bug. Apparently, GAE doesn't support chunked encoding (whatever that means), so their current stable 2.0.8 libraries won't work for GAE/Android communication. Their 2.1m5 libraries have a workaround, but one must manually specify that entities must be buffered (once again, what?). I also paired their 2.0 "first application" example with 2.1m5 libraries without realizing that there was a difference, because their wiki wasn't clear about which version of the sample application you were looking at.

So far, I would say that Restlet is a well-built library for RESTful communication, both server and client-side, but their documentation and examples could really use some organization help. The answers are on their site, but sometimes they seem to be hidden away in dark corners of the wiki.

After I got the Restlet "first application" working, I wanted to bolt on datastore support. I wanted to try something other than JDO, so I looked at JPA. However, both seem cumbersome and ill-fitting when it comes to the GAE datastore. Then somehow I stumbled across Objectify, a relatively thin library that works on top of the datastore's low-level API. Some of the attractive features were a less-cranky Key interface, using POJOs instead of GAE-specific Entities, and GWT compatibility without DTOs. I did have trouble with that GWT part for a while, because I thought that my POJOs still needed to be annotated with javax.persistence.Entity. I think that was causing problems when the DataNucleus Enhancer went and did its magic; the server-side class looked different from what GWT was expecting on the client side, which ended up throwing a SerializationException. However, I finally figured out that stored entities don't need the Entity annotation. The only thing the POJOs need is a javax.persistence.Id annotation on the primary key field.
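As a sketch of what that ends up looking like (class and field names are mine; this assumes the Objectify and JPA annotation jars are on the classpath):

```java
import javax.persistence.Id;

// A plain POJO stored via Objectify: no Entity annotation, no
// GAE-specific Entity class, just @Id on the primary key field.
public class Book {
    @Id Long id;        // left null, Objectify auto-generates the key
    String title;

    private Book() {}   // Objectify needs a no-arg constructor

    public Book(String title) {
        this.title = title;
    }
}
```

The class still has to be registered once, with ObjectifyService.register(Book.class), before the first datastore call.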

I have yet to really explore this thoroughly, but so far both libraries seem to be doing their job and playing nicely. The best part was not having to write DTOs! I did have to fight with both libraries for a while, but hopefully this post will help somebody else avoid those pitfalls.