User Preferences Take the Lead
In a January issue of Dr. Dobb's Journal, Andrew Binstock shares his thoughts on the most profound change that has occurred in the way software is built. We've reversed the thought process from a development-focused one to a process that puts the user at the center of attention.
Initially, this reversal in focus may have been challenging for development teams, but as more successful applications emerged from the new paradigm, the value of sticking to the new order was validated. Today, user needs and preferences routinely take priority over development and even budget concerns. So much effort is focused on the user experience that dedicated positions exist with the sole purpose of keeping the user experience central to development. UX (User Experience) designers have tremendous influence in planning, design, development, prototyping, testing, and training: every aspect of the process.
Exactly what is “User Experience”?
User experience (UX) is the way a person feels about using a product, system or service. User experience highlights the experiential, affective, meaningful and valuable aspects of human-computer interaction and product ownership, but it also includes a person’s perceptions of the practical aspects such as utility, ease of use and efficiency of the system. User experience is subjective because it is about an individual’s feelings and thoughts about the system. User experience is also dynamic, because it changes over time as the circumstances change. (from Wikipedia)
Because its experts understand the importance of user experience in custom application development, INSite Business Solutions has added these skills to its development team's credentials. INSite's complex, data-intensive, web-based applications, as well as its robust ecommerce storefronts, require depth in both development and user experience. In fact, the user's ability to easily and effectively get the needed results from the application is just as important to the process as producing clean code and elegantly integrating data and processes into functionality.
INSite also leverages agile development and project management to make sure it follows through on initial user-centric plans and produces applications that delight end users.
Read Mr. Binstock's “It's the User,” provided below, to learn more.
From Dr. Dobb's Journal:
It’s the User
By Andrew Binstock, January 13, 2012
Historically speaking, the most profound change in the way we build software today is in how we care for the user.
Perhaps because Dr. Dobb’s has been around since 1975, I frequently get letters from readers to the effect that they’ve been programming 20|30|40 years and a current practice mentioned in an article is no different than what was being done in programming decades ago. I rarely publish these curmudgeonly dismissals of current practice because, in my view, they are almost always significant distortions of a superficial similarity. The most common attack I hear is that some Agile practice is nothing new, “we were doing that in the 1980s.” That is almost always a fallacy. I hesitate only over the qualifier “always.”
Programmers who worked in the 1980s know that even the most basic best practices and tools needed for Agile did not exist then. For example, most sites had no formal version control system. Version control was generally done by saving files using names containing dates, so you could go back to an earlier version on the basis of date (rather than what was changed). Once you were assured the current code more or less worked, you were encouraged to get rid of the date-marked intermediate versions. There was no pervasive ability to attribute code changes to a specific person because there was no way to track who had done what. Likewise, build management was not viewed as a discipline unto itself and there were few tools that enabled you to recreate an earlier version of a product because few shops kept track of what versions of components went into what release.
The concept of short sprints also did not exist at most sites. Actually, sprints of any kind were unknown. And modern practices such as continuous integration did not come into the common experience of developers until some twenty years later.
But perhaps the most telling difference between then and now was the view of the user. Up through the beginning of this century, the general IT view of the user was a sort of indulgent condescension. Users were seen as a necessary evil. They were certainly not people you’d encourage to participate in regular product reviews and provide constant feedback.
Without the practices of regular user feedback, short development cycles, continuous integration, and tools like version control, there was no ability to respond quickly to changing requirements. Consequently, there was no agility. Things that look like or smell like agility in such an environment are merely coincidental in appearance. However, even as far back as 1968, it was becoming clear to the folks who had eyes to see that the aforementioned model (despite its persistence as a valid approach for decades more) was not workable and in need of change. Here are some quotations on point taken from a report from a 1968 NATO software engineering conference (note that the full report is a 136-page PDF that loads slowly.)
“We are starting gradually, and building up. My motto is âdo something small, useful, now.’ Large systems must evolve, and cannot be produced all at one time. You must have an initial small core system that works really well.”
And later technology approaches were anticipated: “System testing should be automated as well. A collection of executable programs should be produced and maintained to exercise all parts of the system…As an output of a test validation run, each test should list the modules it has exercised, and as well, should list the interfaces and tables it has tested. It is important to document success, as well as failure.”
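The testing idea the 1968 report anticipated can be sketched in a few lines of modern Python. This is purely illustrative: the module names and the small test registry below are hypothetical, not from the report or from any real framework, but they show tests that report which parts of the system they exercised and document success as well as failure.

```python
# A minimal sketch (hypothetical modules and test names) of the 1968 idea:
# each test declares which modules it exercises, and the run reports
# passes as well as failures alongside that coverage information.

tests = []

def exercises(*modules):
    """Register a test function along with the modules it claims to exercise."""
    def wrap(fn):
        tests.append((fn, modules))
        return fn
    return wrap

@exercises("parser", "symbol_table")
def test_parse_assignment():
    assert 1 + 1 == 2  # stand-in for a real check against the parser

@exercises("codegen")
def test_emit_noop():
    assert isinstance([], list)  # stand-in for a real codegen check

def run_all():
    """Run every registered test; record its name, modules, and outcome."""
    results = []
    for fn, modules in tests:
        try:
            fn()
            results.append((fn.__name__, modules, "PASS"))
        except AssertionError:
            results.append((fn.__name__, modules, "FAIL"))
    return results

for name, modules, status in run_all():
    print(f"{status} {name} exercised: {', '.join(modules)}")
```

Modern test runners and coverage tools provide this kind of reporting out of the box; the point here is only how directly the report's wish list maps onto practices we now take for granted.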
The report itself communicates a gathering dissatisfaction with the waterfall model of that era, which was highly focused on requirements gathering, design, and implementation as separate and distinct stages. There was also great concern that as program size grew, this approach would become increasingly unmanageable. The concern was primarily expressed in terms of cost and schedule, rather than anything else. Satisfying the user was considered a given via the requirements stage. So there was no strong feeling about the need for a methodology that enabled changes to occur during the development phase. The closest anyone comes (in my scan of the document) is this statement: “Users are interested in systems requirements and buy systems in that way. But that implies that they are able to say what they want. Most of the users aren’t able to. We should have feedback from users early in the design process.” Not exactly a full embrace, is it?
We can see that the idea of constant user involvement had not yet emerged even among leading adopters. And there lies the crucial factor. Today, we operate profoundly differently, not just because we have better technology (although certainly that’s a major contributor) and better development practices, but because we have a user-centered approach to development. In that sense, modern practices are different, in fact fundamentally different, from those of previous generations.
Original article: http://www.drdobbs.com/architecture-and-design/its-the-user/232300570
Andrew Binstock is Editor-in-Chief of Dr. Dobb's Journal.