So the big difference between then and now is that I bought a Samsung Galaxy S2, and wow does that purchase ever change things. The SGS2 makes almost every device on the market that isn't a variant of the SGS2 look bad, and that handily includes the XOOM. Other than some very tablet-specific apps (like those that take advantage of fragments) and applications where resolution changes everything, there's almost nothing that the SGS2 can't do better than my venerable "old" XOOM.
Showing posts with label IT. Show all posts
Saturday, 17 December 2011
Wednesday, 24 August 2011
Android Diaries, Part 4: Fixing the SGS2
The Samsung Galaxy S II is a wonderful device out of the box, but the one thing that irks me is the TouchWiz UI skin. The plain, simple Android experience is damn near perfect; I don't understand why manufacturers need to add extra bloat and crap (MotoBlur, Sense, TouchWiz, etc.) to the user experience. This guide documents my attempts to get rid of all the excess crap.
Tags:
Android,
IT,
Series,
TechSupport
Samsung Galaxy S II Review (9.5/10)
Let’s just get this out of the way: this phone is about as perfect as it gets for now. There are only a few things Samsung could have improved on (although I suspect that by the time the next round of super phones comes out, these issues will all be resolved). Until then, it’s very difficult to see past the sheer awesome that is the Samsung Galaxy S II.
Motorola XOOM: A second look…
Suffice it to say, I love my tablet. Sure it’s got tons of faults and it’s a very niche product, but I definitely love it. I’m not sure, however, if I would buy it again, now that I’ve had the chance to use it for a few months. Back when I got the XOOM, it was the premier Android tablet on the market (and the only Honeycomb one to date); now, the market has several other very worthy contenders and I’m not so sure I would go the XOOM route again.
Saturday, 21 May 2011
Motorola XOOM WiFi Review (9/10)
So I got myself a tablet. There wasn’t a lot to choose from: the Apple iPad/iPad 2, BlackBerry PlayBook, Motorola XOOM WiFi and the Galaxy Tab. I think there may have been a few more, but they didn’t catch my eye. Mind you, almost immediately after making my purchase, the ASUS Eee Transformer, Acer Iconia and revised Galaxy Tab were announced/available. At the time, the decision was down to the iPad vs the XOOM. I ended up with the XOOM (the WiFi-only version because [a] that’s what I wanted, [b] it’s the only model available in Canada at this time, and [c] it’s expensive enough as is).
Android Diaries, Part 3: Gingerbread
I’ve been pretty happy with my cleaned up Milestone and I was content with running a rooted and slightly modded Android 2.2.1. Then I stumbled upon an article outlining a serious data vulnerability affecting Android versions up to and including 2.3.3. This gave me the final push I needed to take the next step with my phone.
Wednesday, 3 November 2010
Logitech G500 Review (9/10)
I've had a host of Logitech mice (and Logitech products in general) over the years. My recent history of full-time mice has been: Logitech Elite Mouse, MX1000, MX510, MX518, G5 v1, G5 v2 and currently the G500. While I'm very satisfied with this mouse (for now), I can look back on my history of mice and tell you the Logitech design team needs a solid smack on the rear-end... what?
Tuesday, 2 November 2010
Running Series: Little Tweaks, Tips and Apps (Part 3)
I have had the luxury of working with multiple monitors for, well... forever. OS support for multiple monitors has come a long way over the last 15 years, and even non-geeks can be found running dual monitors these days. Regardless of whether your background is in graphic design, software development, day-trading or marketing, there's a telltale sign of a power user when they're set up with multiple monitors. Their secret is in their stash of shortcuts, tools and hacks. Keep in mind that although many of these tools focus on multi-monitor setups, some of them apply even to single-monitor users!
Sunday, 14 March 2010
Running Series: Little Tweaks, Tips and Apps (Part 1)
Wednesday, 27 January 2010
Running Series: Win 7/Vista Aggravations (Part 2)
Tags:
IT,
Series,
TechSupport,
Win7
Monday, 11 January 2010
Running Series: Win 7/Vista Aggravations (Part 1)
Tags:
IT,
Series,
TechSupport,
Win7
Tuesday, 22 September 2009
I hate programming
Ok, so that's a bit of a lie. I don't hate programming. I just hate web programming. Or maybe it's me hating reality; that might be it. When we learned how to design software, we were all (or should have been) taught a series of core principles. It's these principles that separate 'our' clean, efficient, meticulous code from that of a hack (who may or may not be brilliant in his/her own right -- just not capable of working on a team, etc.). Whatever. One particular principle that stands out to me relates to hardcoding.
Don't hardcode. Sure, there are cases where hardcoding is acceptable/ideal, but for the most part (i.e. the majority of web and client applications), there's not a whole lot of need for hardcoding, and in fact, hardcoding can get you into a world of pain down the road. Sure, you might think you just finished coding whatever it was you were doing and it's perfect -- until someone comes along and says "hey, can you change to ? thanks!" -- that's when you realize you were dumb and you now have to do a search/replace-all. Naturally, it gets more complicated/tedious when it's no longer a case of search/replace-all. So yeah, hardcoding is generally non-ideal (besides, it's far more elegant to have a universal algorithm that handles a wide range of scenarios than to be stuck in the box of only one scenario).
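A toy sketch of the point (the tax-rate value and function names here are made up purely for illustration):

```python
# Hardcoded: the rate is baked into every call site, so changing it
# later means hunting down each occurrence with a search/replace-all.
def total_hardcoded(price_cents):
    return price_cents * 113 // 100

# Parameterized: the value lives in exactly one place (a constant, a
# config file, or an argument), so a change is a one-line edit -- and
# the same function now handles any rate, not just one scenario.
TAX_PERCENT = 13  # made-up example value

def total(price_cents, tax_percent=TAX_PERCENT):
    return price_cents * (100 + tax_percent) // 100

print(total(10000))     # same result as total_hardcoded(10000)
print(total(10000, 5))  # the "universal" version covers new cases for free
```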
Tags:
Fail,
IT,
Programming,
Rant
Thursday, 10 September 2009
Trillian Automation + Too much time on my hands...
Preface
While the majority of my friends and colleagues have reasonably reliable internet, there are a few who, for whatever reason, have a hard time maintaining their connections. While most of them know that their connections are unreliable at best (whether due to their ISP, bandwidth usage, equipment failure or the fact that they're wardriving), a few have no idea that their connections are stuttering.
A connection is said to be stuttering when it drops, but for such a short period that, based on how its owner uses it, there is no glaring indication of what just happened (I imagine most users don't run perpetual ping tests). Microsoft's 'Live Messenger' (and the various other chat services) is a major culprit here -- particularly so with the advent of offline messaging.
MSN (and other chat services) operate by having a user sign on. This sets the client's state as 'connected' (this is independent of the user's actual status, i.e. available, away, invisible, etc.). The client checks back with the central messaging server every so often to make sure that the user is still connected. With the official MSN client, empirically, it looks like this check is made every eight seconds or so. That's just when the client is sitting there, doing nothing. During active conversations, I would estimate the check to be every four seconds or so.
When the client software [successfully] checks in with the central server, the software is "reassured" that it is still connected. The software (and thus the user) is only made aware of a problem in the event that this check fails. What this means is that if the user's connection drops and is subsequently restored during the 7.9-second window between checks (3.9 seconds for active chats), the user has no idea that they were, in fact, offline for a very small window. From the end user's perspective (the one with the dodgy connection), there is no reason to care -- they aren't aware of the disconnect, and thanks to the luxury of offline messages, most of the time they don't miss anything from the conversation.
It is important to note that offline messaging is not bulletproof. I don't quite know how the various offline messaging systems work, but from time to time we've all received the "such and such message failed to be delivered" error, even when both users in the conversation are using client software that supports offline messaging.
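To make the "invisible stutter" idea concrete, here's a minimal sketch. The eight-second interval mirrors the empirical estimate above; the function and the drop windows are made up for illustration:

```python
CHECK_INTERVAL = 8  # idle check-in period (seconds), per the estimate above

def failed_checks(drop_windows, check_times):
    """Given intervals (start, end) when the link was down and the times
    the client checks in with the server, return the checks that failed.
    A drop that falls entirely between two consecutive checks is a
    'stutter': no check fails, so the client never notices it."""
    return [t for t in check_times
            if any(start <= t <= end for start, end in drop_windows)]

checks = [0, 8, 16, 24]  # client checks in every CHECK_INTERVAL seconds

# The link drops from t=2 to t=6 -- fully inside the window between the
# checks at t=0 and t=8, so no check fails and the stutter is invisible.
print(failed_checks([(2, 6)], checks))  # []

# A drop from t=7 to t=9 straddles the t=8 check, so the client notices.
print(failed_checks([(7, 9)], checks))  # [8]
```

The same logic explains why the window shrinks during active chats: with checks every ~4 seconds, fewer drops fit entirely between two checks.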
Tags:
IT,
Programming,
Util