Tuesday, December 27, 2011

Traffic towards Creating IT Roadmaps

There has been a surprising amount of traffic on my creating information technology roadmaps post from a few months back. This could be due to the time of year... maybe people are preparing for the new year and want to get a sense of where they are going. If you are interested in creating information technology roadmaps, this is how I see it done. Keep in mind roadmapping is ongoing work, and so far I have written four posts on the subject:
  1. Getting Started - how to start the process of creating an IT roadmap
  2. Gathering Data - thoughts on gathering data for the roadmap
  3. Technology Trends - how I currently see technology trends
  4. Pedagogical Trends - how I currently see pedagogical trends

Sunday, December 25, 2011

Happy Christmas to all, and to all a good night!

Merry Christmas to all my clients, business associates and blog followers, whom I also consider my friends within this amazing global village in which we live and work. As many of you know, I am currently on a nine-week leave, traveling Thailand and learning the Thai language with my family. This is a very special time, and one that will continue intermittently, as my youngest son was born in Thailand. I want to send thanks to all of you:
  • To my clients for providing amazing opportunities to use my skills and knowledge and to grow as a professional.
  • To my business associates for the support and wisdom you provide when I struggle and have success.
  • To my blog followers, for you motivate me to keep posting and to explore my profession more deeply.

Christmas 2011 at Baan Rai Tin Thai Ngarm, Mae Rim, Thailand.


Thank you all! I look forward to returning to Vancouver in the new year and to continuing to work and communicate with you all. Have a very Merry Christmas and a Happy New Year. May this year bring you much success and good fortune.

Monday, December 19, 2011

DELL Inspiron 6000 is an Ubuntu workhorse

Four years back I purchased a new DELL Studio to replace my old DELL Inspiron 6000. At that time I formatted the drive in the Inspiron and installed Ubuntu 6.10 (Edgy Eft). And now, five years later, the Inspiron 6000 is more of a workhorse than the new DELL Studio. It is running Ubuntu 10.10 (Maverick Meerkat) and has been used as a development workstation running Apache, MySQL and PHP to develop RESTful applications. It has hosted Lucene and Solr for architectural learning. It has done a whole plethora of technology tasks. It has traveled with me for years, and now it is on the road in Thailand, used for blog posting and uploading images. It's a workhorse and has yet to let me down. Given that I have another few weeks on the road, I hope I haven't jinxed it with this post... only time will tell.

So why take the DELL Inspiron 6000 on the road over the DELL Mini 9 or the DELL Studio?
  1. The Inspiron is running Ubuntu, which I figured would be easier to fix when on the road than a Microsoft OS.
  2. Even though the DELL Mini is my personal road warrior machine, the wife and kids don't like the small screen or keyboard. And we wanted to be able to watch DVDs...
  3. The DELL Studio is running Vista... enough said.
  4. If the laptop were lost, broken or otherwise, it would be no great loss; it is over eight years old.
I really don't want to hurt this old laptop's feelings. It is by far my favorite machine ever! It has written more great code than any other machine I have ever worked on. It has generated more content than any other machine. It has generated the most revenue. It has always worked for me. No longer having this laptop would be a great loss! It would be like losing an old friend.

Sunday, December 18, 2011

Volunteer work from home

A while back, when I was cutting my teeth in the ICT4D world, I attended a symposium that was one of the more significant and formative events of my adult professional life. Yes, big words, but I reflect upon the days I spent at Royal Holloway with fondness, knowing it influenced the direction of my life. Many thanks to the ICT4D people who put so much energy into creating the event! Tim Unwin is an exceptional person and academic who would still be my preferred mentor if I ever undertake a PhD.

During this time I read a "paper" written by Tim Unwin in July 2004 titled "Doing development research 'at home'". For me, the point of his paper is that there is an amazing amount of volunteer and development work you can do from home. I also find that since this paper was written in 2004, a lot more tools have become available on the Internet to assist in doing volunteer work. From a philosophical perspective, I also deeply agree with doing volunteer work from home:
  1. It reduces travel and is therefore good for the environment.
  2. Staying close to home also focuses your work on your local community's needs.
  3. It is based more on attraction than promotion, in that the people who want your assistance will 'virtually' come to you.
This is what I see as important to my practice of doing international work from home:
  • Work on things I am really passionate about.
  • Publish all my work and materials for free using an appropriate licensing scheme, with faith that someone somewhere will find the work useful.
  • Offer my expertise in Communities of Practice, and if people comment or want further information about my work, engage and share expertise.
  • Engage, engage, engage... there is an amazing and growing community of learners online. All learners, regardless of stage of learning, require assistance. It's iterative, and it's amazing what you will learn from others, even on topics in which you believe yourself an expert.

Thursday, December 15, 2011

Map of The Problematique

Back in March my daughter was interviewed on CJSF 90.1 FM. She did a magnificent job and has an amazing radio voice.

Recently she has been deepening her music studies with Harmony House Music Training and Performance Centre, and to finish the fall session she spent time in the studio recording with some professional musicians. This is the result... Ana Rose laid down the drum track.

Map of The Problematique by Ana Rose Walkey

Wednesday, December 14, 2011

100 posts

I started 2011 with the goal of 100 blog posts. I have accomplished this goal with close to 90 posts within this critical technology blog and a further 40 posts within my Thailand travel blog. I started this 100-post journey due to my renewed belief that blogging is one of the key online technologies that assist in lifelong learning. In brief, it is about exploring an idea (in writing) while researching, reflecting and getting input from others on the ideas. All adult learners should be blogging all the time. It deepens learning!

What lessons did 100 posts provide?
I ended up exploring a group of subjects really deeply, and for me they spanned a number of related areas.
  • Homebases and outposts - a look at the relationship between social media and your organization's website.
  • Networked and Open PhD #nophd - working towards a PhD from outside the institutions.
  • Cloud computing - a technical look, with accompanying implementations, toward establishing a cloud presence for your organization.
  • Open Educational Resources (OER) - frequent musings, discussion-prompted posts and research regarding the increasing amount of OER.
  • Pedagogical approaches - mostly focused on adult learning and inquiry-based approaches.
  • Inspired Learner Series - inspired adult learners are everywhere... and how they learn and support their learning inspires me.
  • Resist Copyright - we need to push the boundaries of fair dealing / fair use within the learning context! We need more case law for this; if we don't use fairness we may lose it.
  • Director of IT - the role of CTO and Director of IT is becoming increasingly important. Finding good references on the responsibilities of these roles is equally important.
  • Mastery of Music - I started to deepen my learning of folk music through learning an instrument. This will be a long and important journey in my life. I hope documenting this journey serves as an example.
  • Book Reviews - I read books, and for some I write reviews. Writing a review deepens my understanding of the content and provides others an insight into these books.
  • MVC and 3-tier architecture - this series of posts is me getting technical and sharing my experience of good software architecture.
The subjects of posts can emerge from nowhere
I found it interesting how the subject matter of a post or a series of posts would come out of nowhere. Just an idea, a conversation or reading someone else's post, comment or tweet. And in some situations they could become an in-depth investigation of a subject.

All posts should be started, some will atrophy
Any idea for a post can be a good idea, or maybe not. I felt it was important to capture all ideas, do a little work on them, and through time they would either become a full post or atrophy and get deleted as a "candidate" post.

Quirky fun can keep it lively
Keeping a blog lively for yourself and others keeps readers returning and keeps you engaged in writing. I found the occasional quirky post rejuvenated my desire to write.

Posts may be small and unrelated
Like the quirky posts, I also found it necessary to post for the sake of posting. Sometimes a simple idea or fleeting thought became a short post. And the short post became a longer post... which then became a series of posts. My post on Personal Learning Ecologies has become just this... no idea is a bad idea, until it has atrophied and fallen away.

Feedback comes from many sources
One thing I have found is that people comment directly on blogs less frequently than they did in the past. Feedback and contribution can come from GooglePlus, Facebook, Twitter, LinkedIn, email and yes, even a face-to-face conversation. Stay aware of the many social media channels where commenting and feedback can occur. I often made reference to new posts on all of these different social media. It really is the feedback you are after, for it is the guidance and prompting that assist in deepening your knowledge.


And yes, I will try and write 100 blog posts in 2012...

Sunday, November 27, 2011

Vacation Schooling

My family is currently on a well-deserved two-month vacation in Thailand. And during this vacation we have committed to "home" schooling the kids. I'd rather call what we are doing "vacation" schooling, as the environment provided by being in another country, with another language and a different culture, provides many opportunities to take our children's regular curriculum and adjust it to our surroundings. I see the benefits and approaches available as we vacation-school our children as follows:
  1. The benefit of taking the classroom outside
  2. Utilizing the many different K1 workbooks available at a different pace than a classroom with 24 other children
  3. Learning vicariously through the kids (traveling with kids opens doors otherwise not even available)
  4. Applying the lessons in both English and the local language (particularly, counting, math and polite social interactions)
  5. The ability to be more physical (particularly in having two active boys)
Lucas stacking chairs in Chiang Mai in both English and Thai.



related thoughts on global socio-political-economics
Our world is in the midst of a global power and economic shift. I have no doubt about this, and I often ask myself the best way to prepare my children for a world where the original G7 no longer hold the power and the money, and other countries (China, India and Southeast Asia) are the future of the global economy. I often wonder how my North American children will compete with a billion-strong, reasonably well-educated, multilingual workforce born out of the developing world. Well... if they can speak a language or two and understand the cultures from these regions, they may do fairly well. Time will tell...

Sunday, November 20, 2011

Working towards finished

As usual, I have been involved with shipping software; some was for a start-up (which I really can't talk too much about) and the other was a fairly complicated work order to fix an invoicing / e-commerce system. In particular, during the last week I got involved in some conversations about being finished, and I had this simple realization:
When developing online software you will get to finished faster if you ship whenever it works. It's about cognitive load...
So what do I mean by this?

Shipping software involves a lot of details. And many of these details have a good number of interdependencies. Making sure all the details have been thought through as the team nears shipping the software takes work. The best way to reduce the number of details is to resolve them and put them away. In other words, work towards reducing the complexity of the software features you are shipping at any one time. This is done by grouping the features into sets, and shipping each set when it is working and tested. This creates an approach where you frequently ship working sets toward a "finished" product. And each working set provides enough features to engage the users. Always aim to ship the least number of features frequently, and work on soliciting feedback from the users. Feedback can come from a number of sources, including analyzing traffic data or direct engagement with the users. In the end, shipping ten 20-feature sets will take less time and effort than one 200-feature release. And often the feedback received from the customer alters the feature set for the coming releases, improving the product.
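To put a rough number on the cognitive-load argument, here is a minimal sketch (my own simplification, not from the original post) that treats every pair of features in a release as a potential interdependency the team has to keep in mind:

# Rough illustration: count the pairwise interdependencies to reason about
# when shipping n features at once.
def pairwise_interactions(n_features):
    return n_features * (n_features - 1) // 2

one_big_release = pairwise_interactions(200)         # 19,900 pairs to think about
ten_small_releases = 10 * pairwise_interactions(20)  # 1,900 pairs in total
print(one_big_release, ten_small_releases)

Even under this crude model, ten small releases carry roughly a tenth of the interdependencies of one big release, which is the intuition behind shipping whenever it works.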

For more information on this type of approach to software development begin reading about agile and lean approaches.

Thursday, November 17, 2011

Engagement, Language Learning and Analytics

My family and I are traveling in Thailand as a part of our commitment to raising our adopted son Kai. Over the last two days we traveled from Vancouver to Chiang Mai in northern Thailand. One of the main goals for this trip is to begin developing our Thai language abilities. Being immersed has reminded me that language learning is really hard and best done when you engage the learning content often and at regular small intervals.


Something that happened yesterday was a random meeting on the pool deck with a fellow named Peter Lutes, a lecturer with Kagawa University. He is in the process of setting up a dual degree program between Chiang Mai, Thailand and Kagawa, Japan. An interesting part of our conversation was regarding Learning Analytics and their growing importance toward blended and online learning.

All this got me thinking about Learning Analytics and the growth of this relatively new idea within education. In simple terms, the idea is how to use ALL the meta-data that can be "harvested" from a learner's online activities to improve the online learning experience, deepen learning and encourage completion. A little while back David Wiley put together an excellent post about applied learning analytics with great use of a Google chart gadget.


To see the interactive visualization, play with it here. In the end I was thinking about how this applies to my current language-learning task. What I took from David Wiley's post is that frequent and meaningful engagement with the learning content assists greatly with achieving results. And the neat thing about learning analytics is that this can be measured from the beginning and all through a course in great detail. In particular, with online courses all this detailed data is available from log files and other data capture embedded in the software used during the online learning experience. If teachers can closely monitor the engagement they can intervene sooner, so students are encouraged to engage and therefore achieve better results.
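As a concrete illustration of what "measuring engagement from log files" can look like, here is a minimal sketch. The CSV layout with student_id, timestamp and action columns is a hypothetical format assumed for the example, not any real system's log:

# Minimal sketch: count logged learning actions per student per week.
# Assumes a hypothetical activity log in CSV form with the columns
# student_id, timestamp (ISO 8601) and action.
import csv
from collections import Counter
from datetime import datetime

def weekly_engagement(log_path):
    counts = Counter()
    with open(log_path, newline="") as log_file:
        for row in csv.DictReader(log_file):
            year, week, _ = datetime.fromisoformat(row["timestamp"]).isocalendar()
            counts[(row["student_id"], year, week)] += 1
    return counts

# A teacher (or a script) could flag any student whose weekly count drops
# below a chosen threshold and reach out before the student disengages.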

So... what does this have to do with language learning? Engage often, track my engagements, use a variety of approaches and don't stray from being disciplined in my practice.

Wednesday, November 09, 2011

Creating IT Roadmaps, Technology Trends


If you have stumbled upon my series of posts on technology roadmaps and you have been looking at the associated data, you may be wondering where I have come up with all the data and labels for the trend graphs. Described here is how I have derived all this information for the Technology Trends Graph. Every line on the graph shows saturation within the overall trends being analyzed. In this technology graph, small devices will continue to be adopted, the internet platform will continue as the preferred platform for deployment (though it will plateau), cloud computing will continue to be increasingly utilized, and software will broaden to meet more needs. The lines are there to identify trends and to identify events. And yes, there are many other events that could go on all of these lines... feel free to email other important events to peter@rawsthorne.org.


Small Devices
It is predicted that there will be more small devices than there are people by 2020. Even if this prediction is false, there is no denying that mobile / small devices will have an impact on how we communicate and work over the next decade.
  • Palm / Newton - the early stages of small devices need to be credited to the Palm and Newton devices. Even though neither became mainstream, the Palm gained the most consumer acceptance. Both of these products began the commercialization of small devices.
  • 7x24 - with the small device also came an expansion of the "working day". This isn't to say it created a longer working day; it created the opportunity for people to work anywhere, anytime.
  • Blackberry / Nokia - it could be argued that BlackBerry and Nokia were the first two small device companies to gain global commercialization: BlackBerry for its text messaging / pager solution and Nokia for its internationalization.
  • Smartphone - the smartphone is significant as a small device for its usability and its ability to host mobile applications across many functions (web, readers, phone, email, messaging, photography, etc.). Before the smartphone there was no single device that could provide usability, access and a platform for mobile application development.
  • iPhone - the iPhone is a smartphone. The significance of the iPhone is that it set a new bar for usability and therefore made the smartphone accessible / usable for the general public.
  • Android - the Android phone made the smartphone open source and disconnected the operating system from the hardware, so any manufacturer could produce a smartphone. This open source approach is set against the proprietary approach of the Apple iPhone. Adoption rates for Android have been steadily increasing to the point where they are passing the iPhone.
  • iPad - the iPad is doing to the tablet market what the iPhone did to the smartphone market. It innovated the UX, so the product has dominated the market. This does not mean other products will not follow from other vendors and take market share (as Android is now doing to the iPhone). And, as would be expected, Apple has created, or encouraged the creation of, many great learning and knowledge applications for the iPad.
  • Geo-location - the location-based abilities of small devices can be well applied to learning.
  • Tablets - tablets in general will gain in popularity and will gain upon the popularity of the iPad.
Internet Platforms
New and existing technologies continue to support and build the internet as the platform for business, learning, social interaction, media, etc. The trend will continue where the internet as a whole will be the platform used to build and support people's and organizations' endeavours, regardless of whether they are social, business or personal. These items have influenced the internet becoming the platform:
  • Co-location - the ability for organizations of any size to move their internet application software and related servers into specialized facilities could be considered the first step in the internet being the platform. Co-location has enabled 7x24 servers without each organization having to create its own 7x24 location. The co-location centres offer many services in support of organizations moving their internet servers.
  • Bandwidth - with always increasing bandwidth availability, the internet as a platform has moved from the exchange of text and simple documents to streaming high-definition video and assorted rich media.
  • Hosted Solutions - no longer having to take care of your own servers is the next natural step after co-location. Having a technology organization that can administer servers, databases, application software and custom software lessens the burden upon an organization's IT team. The primary difference between a hosted solution and co-location is that an organization does not own the servers at the hosted location.
  • Software as a Service (SaaS) - moves the care and feeding of application software up to the level of the software. With hosted solutions the organization still has the concept of servers and databases; with SaaS an organization only considers the application software and leaves the rest to the provider of the SaaS. This allows an organization to focus on its business (and the software required for success) rather than the infrastructure and HR issues required to host the software.
  • Open source - the sharing, use and availability of "free" software has been occurring since personal computers became available. Open source software now occupies most areas of software, all the way from operating systems, through databases and software development languages and environments, to full end-user applications like word processors, spreadsheets, Internet browsers and email.
  • Web 2.0 - adding collaborative features to the internet changed the whole thing. Even though this is still gaining traction within the business sector, we are entering a time where businesses will increasingly struggle without a solid Web 2.0 strategy.
  • YouTube - having an open and "free" service to host video changed the breadth of media that could be made available to everyone, created by everyone. The internet became a broadcast platform available to anyone with the inclination and technical skill to create their own online videos.
  • Wireless bandwidth - as wireless technology improved it allowed people access to the internet as they roamed. This roaming ability, combined with the Web 2.0 abilities to contribute and collaborate, made the internet open all the time from almost anywhere.
  • Cloud Computing - the ability to create and destroy (virtual) internet servers for a few dollars and to have these servers scale with little to no effort is how cloud computing is another game changer when it comes to the internet as the platform.
  • HTML5 - this HTML standard is a big step toward bringing browsers and mobile devices into a single development approach without the burdens of proprietary features and platforms.
Cloud Computing
The ability to create and host content online for little to no cost began with the early Web 2.0 technologies of blogging, wikis, discussion groups and social tagging. Cloud computing is a collection of servers configured in a way where new services can be requested and built in a matter of minutes. These services are charged based on the amount of CPU and disk space consumed during the life of the service. The service can exist for hours or years, depending on the computing need (a small provisioning sketch follows the event list below).
  • Wiki - was one of the first collaborative publishing systems openly available on the internet. It is the wiki that began the significant shift of groups of people working together on servers hosting free 'publishing' software.
  • Blogger - blogging services like WordPress and Blogger were among the first generally accepted Software as a Service applications. These and other Web 2.0 applications began the move of people working exclusively online.
  • gmail / google apps - Google has created a suite of software that runs exclusively in internet browsers. This collection of software as a service is gaining acceptance and has further proven the viability of hosting business applications in the cloud.
  • Amazon - the Amazon Elastic Compute Cloud (EC2) provided one of the first consumer cloud services focused on giving organizations a place to host their applications without having to consider the hardware infrastructure.
  • Rackspace - introduced an alternative to Amazon where they allowed organizations to instantiate a Linux virtualized server of their choosing (Red Hat, Ubuntu, etc.) to host their custom or open source application software. This differed from the original EC2 in that you were not constrained by the application framework dictated by Amazon. This has since changed, and Amazon also offers virtualized servers.


  • Education as a Service (EaaS) - educational software as a cloud-based service is coming. There are many articles that discuss the changes coming in higher education, in particular the higher education bubble. There is a lot to read in this area, and these three articles provide a good background:
    1. http://en.wikipedia.org/wiki/Higher_education_bubble
    2. http://www.forbes.com/sites/peterjreilly/2011/11/02/when-will-the-education-bubble-explode/
    3. http://www.economist.com/blogs/schumpeter/2011/04/higher_education
    The two main factors that will encourage the growth of EaaS are: First, cost; it doesn't make sense that an activity (education) in place for the public good has so much redundancy when it comes to Information Technology. Every institution should NOT have its own IT infrastructure. Second, the internet; more and more people using better and better approaches will increasingly be learning online, 7x24. People just won't physically go to school; they will attend online, and cloud-based EaaS platforms will be a cornerstone of this ability.
  • Virtual Asynchronous Conferences - the cost of traveling to and hosting conferences, combined with improving asynchronous conferencing, will increasingly lead to people attending conferences and similar learning events online.
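To ground the claim that servers can be created and destroyed in minutes, here is a minimal, hedged sketch of provisioning and tearing down a virtual server through a cloud API. It uses the present-day boto3 library for EC2 (which postdates this post), and the AMI id is a hypothetical placeholder, so treat it as an illustration of the provisioning model rather than a recipe:

# Minimal sketch: create a virtual server, then destroy it when done.
# Assumes the boto3 package and configured AWS credentials; 'ami-12345678'
# is a hypothetical image id used only for illustration.
import boto3

ec2 = boto3.resource("ec2")

# Request a single small instance -- typically ready within a few minutes.
instance = ec2.create_instances(
    ImageId="ami-12345678",
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)[0]
instance.wait_until_running()
print("running:", instance.id)

# When the computing need is over, destroy the server and the billing stops.
instance.terminate()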
Software
About a decade ago it was a popular discussion within the technology industry that we were shifting from hardware innovation and growth to software innovation and growth. The idea was that we had lots of hardware / infrastructure on which to build and host software, and the growth area was no longer hardware but software. I completely agree with this hypothesis, and we see much evidence in how the greater share of innovation is within software. When you consider the adult learning space, the software having an impact includes many offerings.
  • Open source - the idea of sharing free software has been around for many years and has considerable influence in all software development arenas, in particular higher education.
  • Learning Management System (LMS) - a system (often open source) for managing and shepherding students' learning. It will continue to have an influence within adult learning even though it is being 'replaced' or supplemented with Web 2.0 approaches.
  • Really Simple Syndication (RSS) - made content distribution a pull technology rather than a push. This allowed people the autonomy to access the content they had identified as interesting when they were ready to consume it.
  • Virtual Environments - provide online places for people to 'inhabit' while they learn, socialize or play games. Virtual environments are being increasingly used for learning.
  • Slideshare - allowed for free publishing of presentations with aligned audio (if desired by the publisher).


  • Search - it's not information overload, it's filter failure. This idea encourages innovation in search and other tools that will provide people with the ability to filter and find the information they require within the context and subject-specific areas they are searching.
  • eDiscovery - is the next step of search where greater reach, intelligence and purpose is given to traversing large amounts of information from a variety of sources (digital and otherwise) to find specific and meaningful results.
  • Augmented Reality - having software and tools to assist in understanding information (in particular, large volumes of data) will increase as the technologies to support this ability become commodities. Augmented reality will provide new ways to visualize and interpret information, greatly assisting in knowledge management and learning.
  • Personalized Filtering & Focus - the ability to personalize the emerging technologies of context-specific search, eDiscovery and augmented reality will assist each individual with their knowledge management and learning needs. This personalization is furthest on the horizon for personal learning and related technologies.
What does all this mean?
The main gestalts I get from all this reading, research and reflection are as follows:
  1. Small Devices will become the preferred device for information retrieval and collaboration.
  2. Internet Platforms will continue to mature and innovate. And with the growing amounts of bandwidth and standardization available to the different platforms, accessing knowledge will become increasingly easy.
  3. Cloud Computing will provide cost benefits which will further open up the innovation and commercialization of software (and education) as a service. The barriers to entry will be reduced for new innovative organizations.
  4. Software development rates will continue to increase with new platforms and software applications becoming available to meet more personalized learning needs.

Saturday, November 05, 2011

Progressive Inquiry and Transformative Learning

I believe that in finding:
  1. a trusted set of learning partners
  2. a well facilitated iterative learning approach (progressive inquiry)
  3. a robust and nimble platform for building a community of practice and transformative learning practices
these would be optimal for creating deep learning for adults and those engaged in continuing professional development.

So what do I mean by all this?
  1. you can't learn everything by yourself. No matter how hard you try.
  2. people (read: mentors) will greatly assist in your learning, and these should be ongoing relationships.
  3. inquiry-based approaches deepen learning.
  4. online communities provide an excellent (asynchronous) source for networked learning and for meeting like-minded learners.
  5. collaborative technologies have come a long way in the last 10 years. And you should endeavour to use these technologies, and find the ones that work well for you.
  6. transformative learning is about pushing boundaries, sometimes really far. Learning should oscillate in and out of your comfort zone.

Monday, October 31, 2011

Creating IT Roadmaps, Pedagogical Trends

If you have stumbled upon my series of posts on technology roadmaps and you have been looking at the associated graphs, you may be wondering where I have come up with all the data and labels for the trend graphs. Described here is how I have derived all this information for the Pedagogical Trends Graph. Every line on the graph shows saturation within the overall trends being analyzed. In this pedagogical graph, behaviorism is losing its saturation and almost disappearing, constructivism is losing influence, and connectivism is on the increase. The lines are there to identify trends and to identify events. And yes, there are many other events that could go on all of these lines... feel free to email other important events to peter@rawsthorne.org.


Behaviorism
Behaviorism could be described as teaching that is meant to alter behavior. I see teaching and learning to tests as behaviorism. Studying and the memorization required for the LSAT or SAT could be considered behaviorist approaches to learning. It is believed that behaviorism has been in decline since the early 1990s.

Constructivism
Constructivism is the idea that personal knowledge and meaning are built through the interaction between a person's experiences and their ideas. Constructivism has been the predominant approach to adult learning, and only since the introduction of the internet has it been "replaced" with other emerging approaches. It's hard to say the emerging pedagogical approaches brought on since the introduction of the internet have (and will) replace constructivism. It's better to say that constructivist approaches will continue as the foundation of adult learning, though they will be blended with learning approaches well suited to the internet and personal technology. There are a number of events along the constructivism trend that have influenced its saturation.
  • Increase in self-directed - people are learning on their own and building knowledge upon their current knowledge. This self-direction supports constructivism and also opens the learner to new and different approaches.
  • Connected learning - the recognition that learning happens within groups of online connected people has gained acceptance. Connected learning has brought new learning theories to the fore, taking away from constructivism (or, maybe better said, building upon constructivism).
  • FOSS - Free and Open Source Software has steadily influenced learning systems development. Moodle is an Open Source Learning Management System (LMS) built upon constructivist learning approaches. It has influenced the integration of open source technology and pedagogical approaches. This is important when considering a technology roadmap, for open source technologies are increasingly being used in the educational / learning space.
  • Polling (Clickers) - The idea of audience response is becoming more proven as a pedagogical tool and is finding its way into not only the traditional classroom, but also in many online learning events.
  • Drupal - another open source software system that is increasingly being used to build learning environments for adult education. The strength of Drupal is how it has been built from the ground up as a content management system with great social and collaborative features.
Self-directed
Self-directed learning combined with technology and access has shifted people from learning in institutions and with traditional approaches to seeking alternative ways of learning. What I find interesting (from a roadmapping perspective) is that self-directed isn't so much a pedagogical approach as the personal motivation to strive to learn new things, whatever the motivation. There are a number of events along the self-directed trajectory that have influenced its saturation.
  • Connectivism - is a learning theory for the digital age. Currently, I see this theory/approach being used by and influencing the self-directed learner. As connectivism hasn't gained traction in the traditional institutions, the majority of learners are still influenced by and studying within traditional approaches. For the time being connectivism is for the self-directed.
  • Creative Commons - this content licensing approach has brought the issue of copyright into the mainstream and contributed attention to how fair-dealing / fair-use can be utilized by the independent learner.
  • Progressive Inquiry - inquiry based approaches are gaining in popularity as they encourage the learner to become more involved with what they are learning. Inquiry based approaches deepen learning and provide approaches for the learner to attain mastery.
  • Personal Learning Environment - is a mix of technologies that support the self-directed learner. Essentially the internet is the platform and the learner chooses the technologies and services available on the internet to create their own learning environment used to capture and progress their learning.
  • Khan Academy - is an excellent example of the internet resources available to the learner at no cost using a content licensing approach that encourages the user to use and reuse the content themselves.
  • Accreditation - accreditation models will grow for the self-directed learner. Badges are an example of this self-accreditation.
  • Assessment - mass collaboration or a more public form of assessment without institutional involvement will emerge. In a way, assessment and accreditation is a form of reputation management.
Connectivism
Connectivism is a learning theory that was created by George Siemens. The theory is based on the premise that the digital, connected world requires new learning theories. These new theories and approaches need to be grounded in supporting the learner to interact with peers, mentors and learning resources differently, as so much of this activity now occurs online. Even though George Siemens first published the theory of connectivism in 2005, a number of events occurred previous to 2005 that could be considered influential in the theory's creation.
  • Blogging - on a regular basis has great benefit to the learner. It provides a platform for self-reflection. And due to its public nature, self-publishing to a blog increases the quality of writing and deepens learning.
  • Wiki - by its nature encourages collaboration, online discussion and contribution around specific areas of knowledge. Working with others in creating and editing wiki pages is connected learning.
  • Tagging - also known as social bookmarking, creates a taxonomy for individuals and, if well stewarded, learning communities.
  • Social Media - the origins of Facebook were in creating a platform for students to study and prepare for coursework and tests. It has grown much further than that; much of social and collaborative media facilitates discussion and knowledge building around learning resources.
  • Massive open online course (MOOC) - the MOOC is a very innovative and amazing idea when it comes to connectivism and teaching a large network of collaborative online learners. I do believe the MOOC is still in development as a learning tool; they seem to be gaining acceptance and utility in teaching a very large network of learners.
  • iPads / Tablets - being able to engage with learning resources anytime from almost anywhere will open opportunities for learning. Particularly when all your information devices (television, computer, small device) are aware of the learning occurring on each device.
  • Smart phone - cell phones, smart phones, etc. offer a reach for learner engagement and collaboration that can extend the learning opportunities beyond the small devices. Bringing the smart phone into the connectivist mix is worthy of a blog post in itself. Stay tuned...
  • eBooks (collaboration) - eBooks are coming of age, particularly those with social media and collaborative reading.
  • Reputation Management - All that you do online becomes a part of your online reputation. Your online reputation is the persona you hold within your connectivist learning. Tools and approaches to reputation management will grow and support everyone as a learner and potential mentor.
  • Badges - learning badges are the front edge of learning recognition. A good idea worth exploring... but too early an entry to really get a deep sense of where they will end up.
Blended
Blended is blended! Utilize as many learning resources as are available to you from as many different sources as you can find, and bring them together in one place if you can; this is blended learning. Participating in an online discussion, attending a lecture, reading an academic paper, collaborating over a wiki page, a hallway discussion and time with a friend discussing ideas from a magazine article all add up to blended learning. It has become accepted that informal learning makes up the majority of a person's learning, and the online resources that support blended learning are increasing. These are some of the items that are increasing and encouraging blended learning:
  • Internet - the internet is the platform for learning and it provides many possibilities to blend learning resources and to build personal learning networks with others of similar interests. The internet (and related technologies) can also blend well with traditional approaches to learning.
  • Open Educational Resources (OERs) - it is the creator of the OER that learns the most. Over the last 10 years there has been considerable activity within Higher Education toward the creation and use of OER. In the long-term OER will have increasing acceptance and availability, and those who collaborate, create and reuse (rather than only consume) the OER will learn the most.
  • Online conferencing - bringing together like-minded people to discuss and exchange ideas is becoming increasingly well supported through web-conferencing. The online-conference is another source for blended learning.
  • DIY U - Do It Yourself University is as much a political movement as an idea that puts the responsibility and cost of an education back into the learner's control. From a blended approach, it encourages the learner to seek alternate avenues to gaining an education.
  • Education as a Service (EaaS) - with the growing success of cloud computing, combined with growing internationalization in higher education, the discussion around Education as a Service is increasing. EaaS is coming, and once available it will increase the options available to blended learning. One of the key features of EaaS will be the tools available to manage the progression of a person's learning and to encourage deep learning, assessment and accreditation.
  • Learning Analytics - is at the early stages of becoming an approach used within learning and education. This could potentially have a big impact on assessment and accreditation within blended (and all) learning approaches. Stay aware of Learning Analytics.
Internship
The idea of an internship is to find like-minded people or an individual to assist you on your learning journey. The learning internship builds upon the apprenticeship model of learning with the addition of other learning before and during the internship. It is the author's belief that the internship trend is currently decreasing and will again begin to increase once greater acceptance of the internet as a learning platform occurs within traditional learning and accreditation institutions. The two themes that influence internship are:
  • Community of Practice (CoP) - the community of practice is well supported by online tools and techniques. Joining an online CoP and collaborating with others is one of the current methods of online internship.
  • Super-mentor - the idea of the super-mentor comes from Curtis Bonk. I agree with his thesis on the future of learning; I see the super-mentor playing a big role in many people's internships.
What does all this mean?
The main gestalts I get from all this reading, research and reflection are as follows:
  1. Behaviorism is in decline and will remain so. As an educational practice it will flatten out and remain present as long as standardized testing remains. Sigh...
  2. Constructivism may decline in how much it "saturates" pedagogical approaches to learning, though it will remain the foundation to all emerging learning approaches.
  3. Self-directed learning will continue to grow as more people adapt, learn and take advantage of the approaches that are increasingly available on the internet.
  4. Connectivism will become an accepted theory supporting learning in the digital world. An increasing number of tools and approaches will come available on the internet to support connectivism.
  5. Blended learning will become the standard approach to learning. It will take the best from all approaches and allow the learner to adapt them to suit their needs.
  6. Internship will become increasingly available as the acceptance, approaches and people become more familiar with its importance. This will come from both the learner and mentor side of the learning relationship.

Wednesday, October 26, 2011

IT skills and managing your partners

Three realities to consider when running an organization:
  1. Your organization is becoming increasingly dependent on Information Technology (IT).
  2. Good, I repeat, Good IT professionals are becoming increasingly difficult to find.
  3. Your organization should focus on what it is good at. And unless you are an IT vendor, consultancy, etc. your organization should not staff up for IT, for it would distract you from your focus.
Given these three realities, this is how I see your organization managing IT:
  1. Develop technology partnerships to fulfill different sectors of your IT needs; find awesome Subject Matter Experts within these partnerships (you should not have to pay a partner to develop a subject matter expertise). The different sectors could be infrastructure, software development, accounting systems, web development, mobile... etc. How you divide up these sectors depends on how your organization is structured (and divided), and who holds the responsibilities and accountability. You may find that one of your partners provides services to more than one of these sectors (preferably your strongest and most trusted partner).
  2. Get to know your partners strengths and weaknesses, meet with them face-to-face regularly. With increasing difficulty in finding good IT partners you may find that "good enough" is all you can get. So be prepared to manage each partners abilities differently... work with their strengths and manage their weaknesses. 
  3. Have a trusted, highly available partner that is invested in keeping everything working together and is nimble in making fixes and enhancements to your customer facing technologies. This is where you may have an employee or small IT department as you may not be able to find a partner to make this level of commitment.
  4. Seriously consider moving your customer facing infrastructure and websites (including mobile) to a hosted or cloud based environment. 
  5. Have very strong IT management skills at the executive (and board) level.

Monday, October 24, 2011

Creating IT Roadmaps, Gathering Data

This is the second post in a series describing a technology roadmapping exercise I am completing. All the posts in this series can be found under my roadmap label for this blog. This post focuses on how, why and where I am gathering data, with the beginnings of how I am organizing and visualizing the data.

1. Narrow the subject area and context

This roadmap will focus on adults engaging in continuing professional development and lifelong learners focused on legal education for lawyers, legal assistants, notaries and self-represented litigants. In general, the audience is focused on accessing legal materials and related learning resources published from a number of online sources, both public and private. The context for access is usually researching a subject of personal or professional concern over a (long or short) period of time, the assumption being that the longer the duration, the greater the depth. This does not mean that short bursts of access are not seeking depth of learning.

The main threat is within two areas. The first is published materials: not that the published materials will be replaced, but that customers expect them to be available on new devices, which eases access geographically and 24x7, and allows greater collaboration around the published materials so they are more relevant and up to date. Customers will increasingly seek published materials made available in this way. The second is online programs, courses, workshops, etc. Blended and online learning is growing, and this eases the need to travel and to allocate set blocks of time to attend learning events.

2. Know your audience

The audience is adult learners with a post-secondary level of education. Their learning styles are going to be constructivist, with a strong influence from connectivist approaches. Increasingly these learners are looking for alternate ways to access learning materials. These alternatives are both geographic (reducing the need to travel, with access from any device at any time) and temporal (the ability to access learning resources 7x24). When a learner stops working on one device, the next device on which they resume their learning knows where they left off.

Understanding the technology adoption rates for your audience is very important. The challenge is finding data defining technology adoption rates for specific audiences and the adoption rates for the different demographic groups within the audience. If you have the resources, doing surveys targeted toward your audience can be very helpful. Otherwise, staying aware of technology trends and bookmarking or tagging technology adoption data is a good way to gather it. I often tag resources related to technology adoption; they fall within my "roadmap" delicious tag, which you can follow here: http://delicious.com/prawstho/roadmap
 
3. Acknowledge that roadmaps are visual tools

Within this roadmap there are a number of different attributes that need to be represented in a single visual (well, potentially multiple visuals). As my research into these attributes deepened, they began to fall into three main categories (a small charting sketch follows the category lists below):

Pedagogical - events, ideas, new theories, approaches that relate to teaching and learning.
  • emerging and existing learning theories
  • emerging approaches to online learning and teaching
  • social and collaborative technologies well suited to learning
Technological - current and emerging technologies well suited to and influencing adult learning.
  • personal devices and browsers
  • internet and technology platforms 
  • application software well applied to learning
Sectoral - subject or business sector attributes that should be considered or that will influence the roadmap.
  • strategic plan (known initiatives)
  • financial & economic
  • jurisdictional issues
  • threats
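For the visual side, here is a minimal sketch of how trend and event data like this could be structured and charted. The trend names, saturation values and events below are illustrative placeholders of my own, not the data behind the actual roadmap graphs, and the sketch assumes the matplotlib library:

# Minimal sketch: structure trend saturation data and labelled events,
# then plot them in the lines-plus-events style described in this series.
# All values here are illustrative placeholders.
import matplotlib.pyplot as plt

trends = {
    "Small devices":      [(2000, 5), (2005, 15), (2011, 45), (2020, 90)],
    "Internet platforms": [(2000, 20), (2005, 45), (2011, 70), (2020, 90)],
    "Cloud computing":    [(2005, 5), (2011, 25), (2020, 80)],
}
events = [(2007, "iPhone"), (2010, "iPad")]

fig, ax = plt.subplots()
for name, points in trends.items():
    years, saturation = zip(*points)
    ax.plot(years, saturation, marker="o", label=name)
for year, label in events:
    ax.axvline(year, linestyle="--", linewidth=0.5)
    ax.annotate(label, (year, 95))
ax.set_xlabel("Year")
ax.set_ylabel("Saturation (%)")
ax.legend()
plt.show()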


Suggested Reading
http://www.downes.ca/me/mybooks.htm
http://www.nmc.org/horizon-project/horizon-reports
http://criticaltechnology.blogspot.com/search/label/roadmap

Saturday, October 22, 2011

Embedding Google Docs

If you want to publish (and embed) a Google doc in a blog post, this is easily possible, as Google Docs provides the embed code. The process of embedding informs you that the underlying data will be made public and read-only, which is kind of the point of blogging about data you have created. Here is a chart I put together for a roadmapping exercise I am currently completing.



What are my data sources for putting together this graph? It is an accumulation of a number of things: 25 years working in technology and as an educator, graduate-level studies in Education with a focus on Information Technology, constant monitoring of RSS feeds, blogs and online publications, a deep curiosity about the subject of educational technology, and the reading of a number of reports on the subject, in particular the Horizon Reports and the exemplary work of Stephen Downes.

Tuesday, October 18, 2011

Learning Architect

Clive Shepherd gets it! What he is speaking of is closely aligned with my idea of a Learning Systems Architect from a while back. I would say the difference between the two is that my role is more technical. I really need to read the book if I am to get a complete sense of Clive's Learning Architect. From what I have read so far by browsing his company's site, reviewing his new book's index, and listening to the embedded video, the Learning Architect designs the pedagogical approaches and recommends the technology platforms to best support the learning. The Learning Systems Architect I speak of works with Subject Matter Experts to design the pedagogical approaches and implements (builds if necessary) the technology platforms to best support the learning. I would see the Learning Architect and the Learning Systems Architect working back-to-back, where the Learning Architect faces the learner and the Learning Systems Architect faces the technology. Regardless of how you see things, if you are into adult education this is a good video to watch.


On a closing note, I really appreciate the way he classifies learning into four approaches: formal, non-formal, on-demand and experiential. The Learning Architect role can be read about in his new book, "The New Learning Architect". Even a browse through the index, and through which technologies fall into the four approaches, can provide insight into learning in the near future. Thanks Clive!

Monday, October 17, 2011

Rackspace Step 5: Updating the DNS

It's been a while since I posted on my work moving all my sites over to Rackspace; it's been summer and the start of the school year for my kids. The task I intermittently focused on through the summer was to move my domain name hosting over to Rackspace. It's great that Rackspace also provides a DNS-based cloud service, and I like the management console available for managing your DNS.

 
Moving DNS servers may not be so simple
Usually you would think that changing DNS servers would be simple, and it should be. Depending on where you start and who "controls" the ability to update, things may not go as smoothly as you would like. I mention this because without a good move of your DNS, your site may disappear from the internet for a period of time. What I want to say is, "When moving your DNS it is important that you monitor the move closely". This is what happened to me, and a similar series of events could happen to you:
  1. I logged into my previous provider's domain hosting console and changed the domain name servers for the domain I was moving. I was told the save was successful.
  2. I went back to the console to see what name servers were assigned to the domain; it still showed the old names. I figured this was OK because name server changes need to propagate throughout the internet to truly complete.
  3. A couple of days later I logged into the domain hosting console to check the name servers associated with the domain. They were still set to the old name servers. Naturally, I tried again to update them myself. And again I got a confirmation of the change.
  4. I got busy, and a few days later I checked the names again; my DNS was still pointing at the old name servers. I wrote an email to tech support, sent it off and waited.
  5. Almost immediately, I got confirmation of my query and was assigned a tracking number for the issue. A few days later, nothing, so I phoned... I did speak to someone, and they confirmed they had made the change to the correct domain name. I was adamant about this, and they confirmed the correct domain name.
  6. The next morning I logged into my domain hosting console and discovered they had made the name server changes to the incorrect domain.
There is really no point in going any further with this description, and eventually I got it all cleared up. Needless to say, all this only confirmed I was doing the right thing in moving away from NetNation as my hosting provider. Don't get me wrong, NetNation has provided me with many years of very stable hosting. It's just that my needs have changed and the cost savings provided by cloud-based services are too strong to ignore. The main lesson learned is that when making changes to anything DNS-related, you need to monitor it very closely, particularly when there are intermediaries involved...
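One simple way to monitor the move yourself, rather than trusting the provider's console, is to ask the public DNS which name servers it currently reports for the domain. Here is a minimal sketch; it assumes the third-party dnspython package (the dig command-line tool gives the same answer), and example.org is just a placeholder for the domain being moved:

# Minimal sketch: query the NS records the public DNS currently reports,
# so you can confirm a name server change has actually propagated.
# Assumes the dnspython package (pip install dnspython).
import dns.resolver

def current_name_servers(domain):
    answer = dns.resolver.resolve(domain, "NS")
    return sorted(str(record.target).rstrip(".") for record in answer)

if __name__ == "__main__":
    print(current_name_servers("example.org"))  # placeholder domain

Run it once a day during the move; when the output matches the new provider's name servers, the change has truly taken hold.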

Suggested Reading
http://en.wikipedia.org/wiki/Domain_Name_System
http://www.rackspace.com/cloud/blog/2009/06/04/dns-the-overlooked-cloud-service/
http://www.rackspace.com/knowledge_center/index.php/Managing_DNS

Saturday, October 15, 2011

Creating Information Technology Roadmaps, Getting Started

Creating technology roadmaps can be hard, mostly because you are trying to predict the future. And predicting the future is, well, unpredictable. So coming up with a technology roadmap for a specific subject or practice area narrows the horizon and could increase success. Gaining as much knowledge of the narrowed area as possible, by reading, reviewing and referencing existing information and related predictions, will help greatly. Essentially, you want to gather all the applicable technology and subject area information you possibly can regarding the present and future and try to gestalt a technology roadmap.

The important factors in creating technology roadmaps are:
  1. Narrow your subject area and context
    This is important mostly due to narrowing the number of attributes influencing the future. The subject area is somewhat self-explanatory: is it higher education, medical, financial, legal, etc.? Context would be mobile technology for adult customers, wealth management for families, etc.
    One important attribute here is to identify any serious threat(s) to the financial health of your organization due to a disruptive technology or competitor. These threats are rarely unforeseen if you have been doing regular roadmapping... for the roadmaps should identify them...
  2. Know your audience
    The audience who will reference the roadmap is important, for they will read it based on their decision-making needs. The audience can be as varied as senior management, customers, business partners, even competitors.
  3. Acknowledge that roadmaps are visual tools
    People have become used to roadmaps being visual tools; invest the time in finding a visual representation that suits your audience. Engage your audience early, present a visual framework and get feedback. Improve the visual. This has two benefits: it assists the audience in learning how to use the roadmap, and it assists the creator in understanding the audience and why they need the roadmap.
  4. You don't know the destination, only important attributes of the journey
    When predicting the future with a technology roadmap there is no destination, only the many factors that influence the decisions you make on the journey. The technology roadmap will provide a topographical map and the roadways that are available to you when making decisions. It is assessing the current location and the things of importance around you (which will change through time) that will determine which route to take.
  5. Sometimes the journey is the destination
    The decisions about the journey are the technology decisions required by the organization. Using the roadmap to know where you are and what currently surrounds you is what the technology roadmap is for. It helps in making the technology decisions right in front of you, no more. Really, that is all that is needed anyhow.
  6. The roadmap should align with the organization's vision and strategy
    The roadmap should be derived from the organization's vision and strategy. If there is no vision or strategy, this should be created before the roadmapping exercise begins.
  7. The technology roadmap will influence the organization's tactical plan
    Also derived from the organization's vision and strategy is the tactical plan. The tactical plan and the roadmap work together to drive the individual projects tasked with implementing the vision and strategy.
Step 1. Start writing openly about these seven factors and how they apply to your roadmapping exercise. Being open and transparent about your thinking and seeking feedback is very important. Using an internal (or privately hosted external) blogging approach and allowing people to comment would be a great way to be open, transparent and solicit input.

Step 2. Begin to gather all the technology roadmap material you can. Search high and low, contact your vendors, contact your peers, investigate industry publications, look for other technology roadmaps. Leave no stone unturned.


Step 3. Begin to create a visual representation of what you are finding. Be creative, and seek out different sources for inspiration. Publish the visual frequently to begin soliciting feedback and developing a shared understanding. It is the feedback and shared understanding that will improve the accuracy of the roadmap.

For a growing list of references on this subject feel free to follow my roadmap tag in delicious;
http://www.delicious.com/prawstho/roadmap

Follow-up Posts
If you have read this far you may be interested in the follow-up posts I have written that actually implement what I have described here;

Friday, October 14, 2011

MVC in a three-tier architecture - TRANSLATED

A month back I wrote a post on architecting web and mobile based applications. In the post I spoke very technically about the MVC pattern and three-tier architectures. One of the comments I got on the post was from a very bright friend of mine who also works in educational technology and professional development, only from a senior management perspective. His comment was that he really wished he knew what I was talking about. After reading the post again, I agreed that for a non-technical person it would be difficult to understand, and I felt there was good value in translating it for the non-technical reader. Explaining the MVC pattern and three-tier architectures in this way would have great value to those who want (even need) to understand the web, mobile technologies and how these are put together. So this post attempts to answer that need... and if it works out, I may rewrite a number of my posts for the non-technical person.


Model-View-Controller
The Model-View-Controller (MVC) is a design pattern used to organize the user interface and activities of a software application. In other words, what does a web page or mobile application look like and how does it work? How is the software designed so pictures, words, links and buttons show up, and what happens when someone clicks on a button or a hyperlink? Each of the Model, View and Controller serves a purpose: the Model is the business logic and related data and processes, the View is what is displayed to the end user, and the Controller handles the events between what is displayed and how the business responds.
  • The View is what is rendered (drawn) on the screen. How it is rendered is where the software part comes in. The technology (programming) behind rendering a View (or screen full of information) includes many options and approaches. Beyond a lot of formatting and graphic design, what makes the View real work is that there are many different screens the software has to render a view for. These screens come from different types of computers (PCs, mobile phones, tablets, etc.), different browsers (firefox, safari, chrome, ie), different sized screens, etc. Writing software for all these different screens takes work, and isolating the code responsible for only rendering the View greatly simplifies what can become complicated.
  • The Model is the business logic or software that collects and retrieves any data required for the View. When data needs to be retrieved from or saved to the data storage it is the model which is responsible. The model can contain a lot of complicated business logic. As an example, someone submitting a credit card transaction may have only a few data fields and one button click on the view but the amount of different business activities (or software modules) that get utilized to complete the credit card transaction can be numerous and span different computers and businesses.
  • The Controller responds to user events. The events can trigger changes or activities from both the View and the Model. As a simple example; a user enters their user credentials (user name, password) and clicks the log in button. The Controller then initiates a call to the Model which in turn executes the software required to find the user name and confirm the password... the Model then returns a call to the View with the pass or fail of the log in and the View re-renders itself with either a logged in user interface or the "forgot password?" user interface.

There you have it, simply put: the View is responsible for drawing the screen, the Model is responsible for retrieving and adding / updating information, and the Controller is responsible for managing the events between the View, the end-user and the Model. Once you feel comfortable with your understanding of the MVC design pattern, move on to the next section describing the three-tier architecture. Hold off asking yourself how the Model-View-Controller (MVC) fits in the three-tier architecture for now. That will be discussed later.
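For anyone comfortable reading a few lines of code, here is a deliberately tiny sketch of the pattern in PHP. The class and method names (GreetingModel, GreetingView, GreetingController) are made up for illustration and do not come from any particular framework; real MVC frameworks add routing, templates and much more.

    <?php
    // A deliberately tiny MVC sketch; all names are hypothetical.

    class GreetingModel {                      // the Model: business logic and data
        public function greetingFor($name) {
            // A real application would read this from a database or service.
            return "Hello, " . $name . "!";
        }
    }

    class GreetingView {                       // the View: only responsible for rendering
        public function render($message) {
            echo "<html><body><p>" . htmlspecialchars($message) . "</p></body></html>";
        }
    }

    class GreetingController {                 // the Controller: responds to the user event
        private $model;
        private $view;

        public function __construct(GreetingModel $model, GreetingView $view) {
            $this->model = $model;
            $this->view  = $view;
        }

        public function handleRequest($name) {
            $message = $this->model->greetingFor($name);   // ask the Model for the data
            $this->view->render($message);                 // hand the result to the View
        }
    }

    // The user "event" here is simply a page request with a name parameter.
    $controller = new GreetingController(new GreetingModel(), new GreetingView());
    $controller->handleRequest(isset($_GET['name']) ? $_GET['name'] : 'visitor');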

Three-Tier Architecture
The three-tier architecture was conceived over 20 years ago and was initially adopted when building client-server software. The idea was that the personal computers on the network were the clients and the big computers on the network were the servers. The clients made requests to the servers, the servers talked among themselves (business logic servers and database servers), figured things out, and then responded back to the clients. This basic idea continues. Now the clients are web browsers and other devices, and the servers are web site servers, business logic servers and database servers.

Why so many servers? Performance, security, reuse, maintainability and understandability. There are actually other reasons to implement a three-tier architecture. I see these as the primary reasons;
  1. Performance, security - Any interactive website with a significant amount of traffic could not be hosted on a single computer for performance reasons. The software needs to be constructed in a way where it can span multiple servers, and the best way to do that is to modularize the software based upon its activity. Within a three-tier architecture, modules would be constructed for the specific activities of user interface rendering, business logic, database reads and database writes. Each of these modules would be optimized for its activity and implemented across servers best suited to the module's performance needs. Another powerful reason for separating modules and hosting them on different servers is security: the closer a software module is to the storage of a piece of content (data, rich-media, documents), the stronger the security needs to be. It is also important to mention the idea of a cache; when building high performance web sites a cache takes frequently requested content and makes it more quickly available to the web site.
  2. Reuse, maintainability - Modularizing software enables reuse and increases maintainability. The idea is that many modules can call one module to perform the same activity. As an example, saving a bill to address and saving a ship to address are almost exactly the same activity, so this should be done through a single module (see the sketch below). This also eases maintainability: if you need to fix or update the address saving abilities of the software, it only has to be done in one module.
  3. Understandability - Understanding how a website application is put together, particularly when it spans multiple servers, becomes increasingly important as time passes. People change, business approaches change, features get added and updated. The IT team's ability to enhance and maintain a website is greatly improved when the software is well organized and built in a modular way.
Building software in three tiers provides the flexibility to organize the software to meet performance and business needs while it is operational. How the software is going to perform under user load is not always known until after the software is released onto the internet. This is especially true for web-based software, as performance will be determined by user traffic and users are unpredictable. Business needs will also change, so being able to alter the software with little to no impact on existing software modules is more easily done within a three-tier architecture. Well designed three-tier architectures are more easily understood and maintained than other approaches.
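Here is the reuse point from the list above as a small sketch in PHP. The AddressModule class, the addresses table and its columns are hypothetical; the point is simply that both the bill to and ship to addresses flow through one module, so a fix or change only ever has to be made in one place.

    <?php
    // One module saves both address types; table and column names are hypothetical.

    class AddressModule {
        private $db;                               // write-side database connection

        public function __construct(PDO $db) {
            $this->db = $db;
        }

        // $type is either 'billing' or 'shipping'; the saving logic is identical.
        public function save($userId, $type, array $address) {
            $stmt = $this->db->prepare(
                "INSERT INTO addresses (user_id, type, street, city, postal_code)
                 VALUES (:user_id, :type, :street, :city, :postal_code)"
            );
            return $stmt->execute(array(
                ':user_id'     => $userId,
                ':type'        => $type,
                ':street'      => $address['street'],
                ':city'        => $address['city'],
                ':postal_code' => $address['postal_code'],
            ));
        }
    }

    // Both calls exercise the same module, so a change to how addresses are
    // stored only has to be made in one place.
    // $addresses = new AddressModule($writeConnection);
    // $addresses->save(42, 'billing',  array('street' => '123 Main St', 'city' => 'Vancouver', 'postal_code' => 'V5K 0A1'));
    // $addresses->save(42, 'shipping', array('street' => '456 Oak Ave', 'city' => 'Burnaby', 'postal_code' => 'V5C 1A1'));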

The MVC implemented in a three-tier architecture 
Figure 1. The MVC / 3-Tier Hybrid
How do these two come together? From an MVC perspective, the View and Controller exist in the presentation tier and the Model spans the business and data tiers. From a three-tier perspective, this means the Model is broken into modules which are optimized based on their activity and for their ability to be reused and altered for new business opportunities. Keeping the Controller and View in the presentation tier is also optimal, for it allows the software developer to build, render and respond to user interfaces best designed for the different devices (internet browsers and mobile devices).
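As a rough sketch of that split, assuming hypothetical class names, table and connection details: the UserModel below is what a presentation-tier Controller would call, while the read and write modules behind it belong to the data tier and could live on different servers.

    <?php
    // The Controller and View stay in the presentation tier; the Model below
    // spans the business and data tiers. All names here are hypothetical.

    class UserReadModule {                          // data tier, Reads side (replica)
        private $db;
        public function __construct(PDO $readDb) { $this->db = $readDb; }

        public function findByEmail($email) {
            $stmt = $this->db->prepare("SELECT id, name, email FROM users WHERE email = :email");
            $stmt->execute(array(':email' => $email));
            return $stmt->fetch(PDO::FETCH_ASSOC);
        }
    }

    class UserWriteModule {                         // data tier, Writes side (primary)
        private $db;
        public function __construct(PDO $writeDb) { $this->db = $writeDb; }

        public function updateName($userId, $name) {
            $stmt = $this->db->prepare("UPDATE users SET name = :name WHERE id = :id");
            return $stmt->execute(array(':name' => $name, ':id' => $userId));
        }
    }

    class UserModel {                               // business tier: what the Controller calls
        private $reads;
        private $writes;

        public function __construct(UserReadModule $reads, UserWriteModule $writes) {
            $this->reads  = $reads;
            $this->writes = $writes;
        }

        public function profileFor($email)         { return $this->reads->findByEmail($email); }
        public function renameUser($userId, $name) { return $this->writes->updateName($userId, $name); }
    }

    // Hypothetical wiring: reads go to a replica server, writes go to the primary.
    // $model = new UserModel(
    //     new UserReadModule(new PDO('mysql:host=db-replica;dbname=site', 'user', 'pass')),
    //     new UserWriteModule(new PDO('mysql:host=db-primary;dbname=site', 'user', 'pass')));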

So how is all this best described from a non-technical perspective? We will start at the top and work our way down and back to the top again. Our scenario includes three activities: arriving at a web site, logging into the site and adding a ship to address to a person's profile.

Scenario: A user wants to update their profile so they have separate ship to and bill to addresses.
Note: this scenario may not be how all websites handle these activities; it serves as an example to explain the MVC pattern and three-tier architecture.

Arriving at a web site
The user arrives at a website and the site's main page is displayed. The main page is rendered in the following manner;
  1. Software in the Controller will determine the type of device browsing the website. Once the device / browser type is determined the Controller requests the correct View to render itself. Specific Views are built to service the different device and browser types.
  2. The rendering of the View will make requests of the Model to fetch data (text, images, etc.) from the business and data tier modules. These modules will mostly fetch data from the Reads side of the data storage (see Figure 1. The MVC / 3-Tier Hybrid). For performance reasons, some websites will have these reads come from the cache. Many Views will also use style sheets and templates to help in their rendering.
  3. All this activity takes only moments in a well designed website and once completed the View will be rendered and the user can view and interact with its content.
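A condensed sketch of steps 1 and 2 might look like the following. The device check, the cache key, the pages table and the template file names are all assumptions for illustration; it uses PHP's Memcached extension for the cache and a PDO connection for the Reads side.

    <?php
    // Steps 1 and 2, condensed. The device check, cache key, pages table and
    // template file names are assumptions for illustration.

    function detectDevice($userAgent) {
        // Very rough check; real sites usually use a device-detection library.
        return preg_match('/Mobile|Android|iPhone|iPad/i', $userAgent) ? 'mobile' : 'desktop';
    }

    function fetchFrontPageContent(Memcached $cache, PDO $readDb) {
        $content = $cache->get('front_page_content');         // try the cache first
        if ($content === false) {
            // Cache miss: fetch from the Reads side of the data storage...
            $stmt = $readDb->query("SELECT title, body FROM pages WHERE slug = 'home'");
            $content = $stmt->fetch(PDO::FETCH_ASSOC);
            $cache->set('front_page_content', $content, 300); // ...and keep it for 5 minutes
        }
        return $content;
    }

    // The Controller picks the View; each View knows how to render for its device type.
    $device = detectDevice(isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '');
    $view   = ($device === 'mobile') ? 'templates/home.mobile.php' : 'templates/home.desktop.php';

    // Hypothetical wiring for the cache and the read connection:
    // $cache   = new Memcached(); $cache->addServer('127.0.0.1', 11211);
    // $readDb  = new PDO('mysql:host=db-replica;dbname=site', 'user', 'pass');
    // $content = fetchFrontPageContent($cache, $readDb);
    // include $view;   // the template renders $content for the chosen device
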
Logging into a site
After reviewing the site's main page, the user logs into their account. The process is managed as follows;
  1. The user types their email and password into the required text fields and clicks the associated button. This click event is intercepted by the Controller, which decides the appropriate action. In most cases, some software (usually javascript) is run within the Controller to check that the email and password are complete and follow some basic rules. If all is good the Controller will pass the email and password on to the Model as parameters for the verification request.
  2. The Model will take these parameters and make a request to a module built to handle user related requests to verify the email and password are valid.
  3. The module dedicated to user requests will query the Reads side of the database modules to fetch the user's password based on the email address provided in the parameter(s).
  4. If the password matches, the module will pass TRUE back up to the Model, and the Model will request the View re-render itself with a valid login. This re-rendered View will often have additional menus allowing the user to perform actions not available to a non-logged-in user.
  5. If the password doesn't match the module will pass FALSE back up to the Model, and the Model will request the View re-render itself with an invalid login. This re-rendered View will again prompt the user for their email and password with the additional prompt of "Can't access your account?".
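Condensed into one self-contained sketch, the login round trip could look like this. The in-memory user list stands in for the user module on the Reads side of the database, and the returned strings stand in for the Views the Controller would ask to re-render; all the names are made up for illustration.

    <?php
    // The login round trip, condensed. The in-memory user list stands in for the
    // user module on the Reads side; the returned strings stand in for the Views.

    function verifyCredentials($email, $password) {
        // Stand-in for the Model / user module looking up the stored password hash.
        $users = array(
            'jane@example.com' => password_hash('s3cret', PASSWORD_DEFAULT),
        );
        if (!isset($users[$email])) {
            return false;
        }
        return password_verify($password, $users[$email]);    // TRUE or FALSE back up to the Model
    }

    function handleLogin($email, $password) {
        // Controller-side checks before anything goes to the Model (on a real
        // site these basic rules usually also run as javascript in the browser).
        if (filter_var($email, FILTER_VALIDATE_EMAIL) === false || strlen($password) < 6) {
            return 'login_form_with_prompts';                  // re-render the View with prompts
        }
        return verifyCredentials($email, $password)
            ? 'logged_in_view'                                 // View re-renders with the user menus
            : 'login_form_with_forgot_password';               // View re-renders with "Can't access your account?"
    }

    echo handleLogin('jane@example.com', 's3cret') . "\n";     // logged_in_view
    echo handleLogin('jane@example.com', 'wrong!!') . "\n";    // login_form_with_forgot_password
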
Adding a ship to address
Once the user has successfully logged in they will have the ability to edit their profile. The process of displaying their profile page and adding a ship to address would be as follows;
  1. The profile View would make requests of the Model to fetch all the data required to render the person's profile. This data fetching would occur via business and data tier modules built to handle the user related requests.
  2. These requests would occur on the Reads side of the database modules and once completed the Model would request the View to re-render itself with all the users profile data.
  3. Text fields for the user's ship to address would be presented and the user would complete the required fields. The user would then click a button to update their profile and save this new address to the database.
  4. The click event is intercepted by the Controller and the appropriate action would be performed. Again, in most cases, some software (usually javascript) is run within the Controller to check the data entered follows some basic rules for accuracy.
  5. The Controller would make the request of the Model to save the ship to address. The Model would make requests of the business and data tier modules specifically built to save address data.
  6. The request to save this new address would occur on the Writes side of the database modules. Database writes are different from reads; this is explained in greater detail in a related post titled, "separation of database reads from writes".
  7. If the saving of data is successful two activities will occur; first, a confirmation will be sent back up to the Model, and the Model will request the View re-render itself with a confirmation of the data being successfully saved. Second, the data saved to the write side of the databases will be synchronized with the Reads side making it available for any subsequent read request.
  8. If the saving of data is unsuccessful, a return code will be sent back up to the Model, and the Model will request the View re-render itself with the correct error message. The user would correct the errors and the round trip of saving the data would occur again.
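A sketch of steps 5 through 8, with a hypothetical addresses table and write-side connection: the save happens on the Writes side, and the Model gets back either a confirmation or an error it can hand to the View. Keeping the Reads side in sync is left to replication, as described in the related post mentioned above.

    <?php
    // Steps 5 through 8, condensed. The addresses table and the write-side
    // connection are hypothetical. Assumes PDO::ATTR_ERRMODE is set to
    // PDO::ERRMODE_EXCEPTION so failures raise a PDOException.

    function saveShipToAddress(PDO $writeDb, $userId, array $address) {
        try {
            $stmt = $writeDb->prepare(
                "INSERT INTO addresses (user_id, type, street, city, postal_code)
                 VALUES (:user_id, 'shipping', :street, :city, :postal_code)"
            );
            $stmt->execute(array(
                ':user_id'     => $userId,
                ':street'      => $address['street'],
                ':city'        => $address['city'],
                ':postal_code' => $address['postal_code'],
            ));
            // The Reads side is kept in sync separately (e.g. database replication),
            // so a later profile page request will see this address there.
            return array('ok' => true);                               // View shows a confirmation
        } catch (PDOException $e) {
            return array('ok' => false, 'error' => $e->getMessage()); // View shows the error and re-prompts
        }
    }

    // Hypothetical wiring; the host name points at the primary (write) database.
    // $writeDb = new PDO('mysql:host=db-primary;dbname=site', 'user', 'pass');
    // $result  = saveShipToAddress($writeDb, 42, array(
    //     'street' => '123 Main St', 'city' => 'Vancouver', 'postal_code' => 'V5K 0A1'));
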
Why the hybrid?
Because building websites can be complicated, particularly when the site engages users and has a lot of content that can be targeted toward and created by users. Why the hybrid of MVC and three tiers? Two main reasons. First, the MVC pattern does a great job of simplifying and managing the development of user interfaces over multiple devices and browsers, but it doesn't do a good job of defining how to build scalable server infrastructures. Second, the three-tier architecture does a great job of simplifying and managing the development of high performing, scalable, extensible and maintainable server infrastructures, but doesn't do a good job of defining how to build user interfaces across multiple devices and browsers. With an MVC / three-tier hybrid you can utilize the best of both approaches without compromising either.