Key Findings: Inside Higher Ed’s 2014 Survey of Faculty Attitudes on Technology


There seems to be agreement about Early Warning Systems, which are reportedly in use at the institutions of 89% of responding faculty. 81% of faculty believe that Early Warning Systems contribute to students’ learning gains.

There is less agreement on other points. For example, 36% of academic technology administrators, but only 9% of faculty, strongly agree that online courses can achieve outcomes equivalent to those of in-person courses. The top three factors indicating a quality online education were meaningful interaction between students and instructors (80% of faculty; 89% of administrators); being offered by an accredited institution (78% vs. 84%); and independent certification of quality (50% vs. 68%).

Half (51%) of all faculty respondents believe that introducing more active learning is an important reason for converting traditional face-to-face courses to blended or hybrid courses. Yet more than 8 in 10 instructors say that converting a traditional course to a hybrid course decreased the time available for interaction between students and instructors.

Inside Higher Ed’s annual technology survey yields many important insights. One possible conclusion is that professional development might well address effective ways to engage and support interaction between students and instructors, and provide team-teaching opportunities to interested faculty who have not yet taught an online class (2 in 3 professors) or taken one themselves (32% of faculty).


Jaschik, S., & Lederman, D. (Eds.). (2014). The 2014 Inside Higher Ed Survey of Faculty Attitudes on Technology. Washington, DC: Gallup, Inc.


Five Differences between #LAK & #EDM?



There are five key differences between Learning Analytics & Knowledge (#LAK) and Educational Data Mining (#EDM). Let’s start with some definitions:

The International Educational Data Mining Society defines educational data mining as: “concerned with developing methods for exploring the unique types of data that come from educational settings, and using those methods to better understand students, and the settings in which they learn” (Siemens & Baker, 2010).

The Society for Learning Analytics Research defines learning analytics as: “… the measurement, collection, analysis and reporting of data about learners and their contexts for purposes of understanding and optimizing learning and the environments in which it occurs” (Siemens & Baker, 2010).


1. Discovery

LAK: Human judgment is key; automated discovery is a tool used to accomplish this goal. EDM: Automated discovery is key; human judgment is a tool used to accomplish this goal.

2. Reductionism / Holism

LAK: Emphasis on systemic understanding. EDM: Emphasis on analysis of components and relationships between components.

3. Origins

LAK: Stronger origins in semantic web, outcome prediction & system interventions. EDM: Stronger origins in educational software, student modeling & outcome predictions.

4. Adaptation & Personalization

LAK: Greater focus on informing & empowering instructors and learners. EDM: Greater focus on automated adaptation (computer w/o humans in the loop).

5. Techniques & Methods

LAK: Social network analysis, sentiment analysis, influence analytics, discourse analysis, learning success prediction, concept analysis, sense-making models. EDM: Classification, clustering, Bayesian modeling, relationship mining, discovery with models, visualization.

(Siemens & Baker, 2010).
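As a concrete illustration of one of the EDM techniques listed above (clustering), here is a minimal k-means sketch. The per-student activity data (weekly logins, average quiz score) and the choice of k are invented for illustration; this is not code from either community’s toolkits.

```python
# Minimal k-means clustering on hypothetical per-student activity data:
# each point is (logins per week, average quiz score).
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # pick k distinct points as initial centers
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # Recompute each center as the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers, clusters

students = [(1, 55), (2, 60), (8, 90), (9, 85), (2, 50), (10, 95)]
centers, clusters = kmeans(students, k=2)
# One cluster gathers low-activity/low-score students, the other high/high --
# the kind of component-level grouping EDM then analyzes further.
```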

A Call for Communication & Collaboration:

Despite these differences, there are also many similarities: the researchers’ skill sets and research areas overlap, and the two communities have much to gain from one another’s approaches. Siemens and Baker (2010) call for the two communities to collaborate to bring the “greatest possible benefits to educational practice and the science of learning.”



Siemens, G., & Baker, R. S. (2010). Learning analytics and educational data mining: Towards communication and collaboration. Conference ’10. Retrieved from:


Phil Hill: State of the #LMS Market; #580EdTech

State of the US Higher Education LMS Market: 2014 Edition

I shared the most recent graphic summarizing the LMS market in November 2013, and thanks to new data sources it’s time for an update. As with all previous versions, the 2005 – 2009 data points are based on the Campus Computing Project, and are therefore based on US adoption at non-profit institutions. This set of longitudinal data provides an anchor for the summary.

The primary data source for 2013 – 2014 is Edutechnica, which not only does a more direct measurement of a larger number of schools (viewing all schools in IPEDS database with more than 800 FTE enrollments), but it also allows scaling based on enrollment per institution. This means that the latter years now more accurately represent how many students use a particular LMS.
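The difference between counting institutions and scaling by enrollment can be sketched as follows; the campus records below are hypothetical and do not reflect Edutechnica’s actual figures:

```python
# Hypothetical campus records: (LMS name, FTE enrollment).
campuses = [
    ("Blackboard", 20000), ("Blackboard", 15000),
    ("Moodle", 3000), ("Canvas", 12000), ("Canvas", 8000),
]

def share_by_institutions(records, lms):
    # Fraction of campuses using the LMS, each campus counted once.
    return sum(1 for name, _ in records if name == lms) / len(records)

def share_by_enrollment(records, lms):
    # Fraction of total students attending campuses that use the LMS.
    total = sum(n for _, n in records)
    return sum(n for name, n in records if name == lms) / total

# Weighting by enrollment shifts share toward LMSs used at large campuses:
print(share_by_institutions(campuses, "Moodle"))  # 0.2 (1 of 5 campuses)
print(share_by_enrollment(campuses, "Moodle"))    # ~0.05 (3,000 of 58,000 FTE)
```

The second measure is why the 2013 – 2014 data points “more accurately represent how many students use a particular LMS.”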

A few items to note:

  • Despite the addition of the new data source and its inclusion of enrollment measures, the basic shape and story of the graphic have not changed. My confidence has gone up in the past few years, but the heuristics were not far off.
  • The 2013 inclusion of Anglosphere (US, UK, Canada, Australia) numbers caused more confusion and questions than clarity, so this version goes back to being US only.
  • The Desire2Learn branding has been changed to Brightspace by D2L.
  • The eCollege branding has been changed to Pearson LearningStudio.
  • There is a growing area of “Alternative Learning Platforms” that includes University of Phoenix, Coursera, edX and OpenEdX, 2U, Helix and Motivis (the newly commercialized learning platform from College for America).
  • While the data is more solid than 2012 and prior years, keep in mind that you should treat the graphic as telling a story of the market rather than being a chart of exact data.


Some observations of the new data taken from the post on Edutechnica from September:

  • Blackboard’s BbLearn and ANGEL continue to lose market share in the US – Using the 2013 to 2014 tables (> 2000 enrollments), BbLearn has dropped from 848 to 817 institutions and ANGEL has dropped from 162 to 123. Using the revised methodology, Blackboard market share for > 800 enrollments now stands at 33.5% of institutions and 43.5% of total enrollments.
  • Moodle, D2L, and Sakai have essentially no changes in the US – Using the 2013 to 2014 tables (> 2000 enrollments), D2L has added only 2 schools, Moodle none, and Sakai 2 schools.
  • Canvas is the fastest-growing LMS and has overtaken D2L – Using the 2013 to 2014 tables (> 2000 enrollments), Canvas grew ~40% in one year (from 166 to 232 institutions). For the first time, Canvas appears to have a larger US market share than D2L (13.7% to 12.2% of total enrollments using the table above).

#Textbooks are #dead!! #580EdTech

EDUCAUSE 2014: Publisher Says ‘Textbooks Are Dead,’ and Adaptive Learning Is Rising from the Ashes

Analytics that give insight about how students learn guide the future of the publishing world, EDUCAUSE session speakers say.

“Textbooks are dead. They’re dinosaurs,” said Brian Kibby, president of McGraw-Hill Higher Education.

Kibby’s show-stopping quote came near the beginning of a Tuesday session at EDUCAUSE 2014 on the future of textbooks, led by Kibby and Dr. Robert S. Feldman, deputy chancellor at the University of Massachusetts Amherst.

The next evolution of learning materials will be enhanced by a personalized experience powered by adaptive learning techniques, the pair posited.

Feldman, a textbook author, also teaches at his university. Students taking his courses use digital materials that give him specific feedback about how they are absorbing the material. The data includes precise, paragraph-level — sometimes sentence-level — analytics.

It’s a treasure trove of information for any author embarking on textbook revisions, Feldman says. He can turn to the data provided by his students to see how he can improve learning materials in their next iteration.

By comparing how students performed on tests to the corresponding information in the book, Feldman can predict what section — or even what sentence — might run students off course.

“I can know empirically what material is effective and what material needs work. No author has had this kind of information before,” Feldman said. “I now have confidence that the changes I’m making are actively targeting the areas that are the most troublesome to students.”
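The kind of section-level diagnostic Feldman describes can be sketched as below. The item-to-section mapping and test results are invented for illustration; this is not McGraw-Hill’s actual system:

```python
# Aggregate miss rates per book section from item-level test results,
# flagging sections where students most often go off course.
from collections import defaultdict

# Hypothetical mapping of test items to textbook sections.
item_to_section = {"q1": "3.1", "q2": "3.1", "q3": "3.2", "q4": "3.3"}

results = [  # (student, item, answered_correctly) -- invented data
    ("s1", "q1", True), ("s1", "q2", False), ("s1", "q3", True),
    ("s2", "q1", True), ("s2", "q2", False), ("s2", "q3", True),
    ("s2", "q4", True),
]

def miss_rate_by_section(results, mapping):
    misses, total = defaultdict(int), defaultdict(int)
    for _, item, correct in results:
        sec = mapping[item]
        total[sec] += 1
        misses[sec] += (not correct)
    return {sec: misses[sec] / total[sec] for sec in total}

rates = miss_rate_by_section(results, item_to_section)
# Section 3.1 shows a 50% miss rate (q2 missed by both students),
# flagging it as a candidate for revision in the next edition.
```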

McGraw-Hill is testing these digital initiatives in a variety of classroom scenarios across several disciplines, but Kibby said they were still in the early stages of development.

During the question-and-answer portion of the session, several members of the audience expressed skepticism that these learning stumbling blocks could be so easily isolated, given the wide range of factors that may be impeding a student’s comprehension of the material.

Feldman countered that while the method was still being tested, he already sees results in his own classes and expects to see things turn around once more data has been collected.

“We think this is putting us on the road to higher retention and graduation rates,” he said.



A review of Twitter Accounts & Blogs on Learning Analytics

NextGenLC    @NextGenLC

The stated purpose of this Twitter feed is to “accelerate educational innovation through applied technology to dramatically improve college readiness and completion”. This is an EDUCAUSE program. Recent tweet: “Does #blendedlearning work? …they do if they’re designed right …”, referencing @ChristensenInst and linking to an article of the same name by Thomas Arnett (blog).

George Siemens       @gsiemens

George Siemens has several YouTube videos regarding learning analytics. His Twitter feed is a little more broad-ranging. Late yesterday (10/13/14), Mr. Siemens tweeted, regarding an online discussion, that “Learning Analytics are about learning.” Mr. Siemens also covers MOOCs, OERs, and open course design, among other topics.

EdTechReview          @etr_in

EdTechReview describes itself as “India’s Premier EdTech Community. Spreading awareness on Education Technology. Advocating 21st Century #Education #edtech #edtechchat #21stedchat.” This feed gives me a broader, global view. Many of the references/links are from Saudi Arabia as well as the US, Britain, etc.

EDUCAUSE Learning Initiative @EDUCAUSELI

The EDUCAUSE Learning Initiative Twitter feed describes itself as “a community of higher ed institutions advancing learning through IT innovation.” I thought this would be useful, but the tweets are relatively infrequent (1/day or less). Six days ago, they tweeted about a recently released MOOC study. Another recent tweet (10/2/14) regarded a survey for content anchors (topics) for a conference(?)

Predictive Analytics @PredAnalytics

This Twitter account is focused on predictive analytics, predictive modeling, data mining, programs, big data, and text analytics, as well as business intelligence. Among the many relevant posts is a post from Tableau Software regarding making a dashboard, and a tweet from Roger Schonfeld that predictive analytics resulted in 34,000 one-on-one student-advisory meetings within the past 12 months at GSU (Georgia State University?).

Chronicle of Higher Education

The Chronicle of Higher Education has several blogs gathered on this page. The Wired Campus blog (second url) highlights the latest on technology and education, addressing MOOCs, the recent “fair use” copyright decision, and much more. This blog casts a wide net, reporting on a variety of technology-related news, and serves as a good edtech news feed.


Elearnspace describes itself with the key words: learning, networks, knowledge, technology, community. The last couple of blog posts concern the edX course on Data, Analytics & Learning (#dalmooc), which I signed up for (and attended class tonight). Another post from June highlights the MOOC Research Initiative. One of the comments made by George Siemens this evening was that the instructional methodology for MOOCs has been settled on too soon, and that in the course we will look at various methods of engaging students in the content. From this brief introduction to the edX course, the discussion tonight touched on learning analytics related to MOOCs rather than traditional courses, so at this point I’m not sure how valuable this will be to the literature review. One of the blogs specifically addressed learning analytics:

Next Generation Learning Challenges 

Next Gen Learning Challenges is a blog related to the Bill & Melinda Gates Foundation’s work in technology in education. Among the many topics included in the blog are two related to learning analytics: “Learning Analytics Projects” and “Learning Analytics to study and improve learning.” These two blog topics provide directly relevant, current information. The Learning Analytics Projects topic includes a summary of funded Gates Foundation projects and a report detailing how learning analytics can support college completion.

Inside Higher Ed – Tech Ed

Inside Higher Ed’s blog Hack Higher Education covers a variety of news articles related to higher education technology. One of the blog posts addresses the then-current state of interest in learning analytics, connecting these techniques to innovative funding mechanisms such as Race to the Top, among others. The Inside Higher Ed blog post on learning analytics is:

Digital Balance is a digital marketing consulting company working in Australia and the US. Its specialty is expanding digital capabilities and changing digital behaviors consequent to product implementation. The blog post linked above specifically addresses the implementation of learning analytics in higher education. The author, Hallam, suggests that while she was able to identify a successful application to support student retention at the University of Kentucky, most universities do not have sufficient staff support to make learning analytics a reality. The University of Kentucky utilized 15 FTEs to develop and run their predictive analytics model.

Five Types of Analytics

There are five types of learning analytics: [hindsight] descriptive and diagnostic; [insight] discovery; and [foresight] predictive and prescriptive. Each of these types of analytics answers a different question. Descriptive analytics answers the question: What is happening? Diagnostic analytics answers the question: Why did it happen? Predictive analytics answers the question: What is likely to happen? And prescriptive analytics answers the question: What should I do about it?

When descriptive analytics merges with diagnostic analytics, the result is real-time information on demand with greater interactivity (e.g., student success dashboards!). When this type of real-time information is extended to students themselves, counselors, advisors, and faculty, timely, meaningful feedback becomes possible. Real-time data and meaningful student feedback are incredibly important.

Discovery and predictive analytics detect trends, clusters, and exceptions. Prescriptive analytics simulates multiple potential courses of action in order to answer the question: What is the best course of action? Each of these analytics has distinct tools and analytic methods. With increasing depth of analytics, structured data can yield reports, info-documents, info-apps, and dashboard data (real-time information); performance metrics; ad hoc query and analysis; data discovery; and predictive analysis. Unstructured data can be used for search-based applications, search queries, text/word analytics, and sentiment analysis.
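The four question types can be illustrated with a toy gradebook. The weekly scores, the naive extrapolation, and the intervention threshold below are all hypothetical, chosen only to show how each question maps to a computation:

```python
# One student's weekly quiz scores (hypothetical data).
weekly_scores = {"wk1": 82, "wk2": 76, "wk3": 70, "wk4": 64}
scores = list(weekly_scores.values())

# Descriptive -- What is happening? Average score so far.
average = sum(scores) / len(scores)

# Diagnostic -- Why did it happen? Week-over-week changes reveal a steady decline.
changes = [b - a for a, b in zip(scores[:-1], scores[1:])]

# Predictive -- What is likely to happen? Naive linear extrapolation to week 5.
predicted_wk5 = scores[-1] + sum(changes) / len(changes)

# Prescriptive -- What should I do about it? A simple rule-based action.
action = "refer to tutoring" if predicted_wk5 < 70 else "no intervention"

print(average, predicted_wk5, action)  # 73.0 58.0 refer to tutoring
```

A real prescriptive system would simulate multiple courses of action rather than apply one fixed rule, but the question each layer answers is the same.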


Corcoran, M. (n.d.). The five types of analytics. Retrieved from:

Schmarzo, W. (2014). Business analytics: Moving from descriptive to predictive analytics. Retrieved from:

#580EdTech; #Learninganalytics

A Purdue #Learninganalytics Experiment


Learning analytics includes near-real-time or real-time analysis of students’ performance, yielding actionable information that results in meaningful student feedback, such as where and when to get help. In 2012, Arnold and Pistilli examined Course Signals, an early intervention system that uses a student success algorithm to predict which students might be falling behind. The algorithm utilizes current course performance (percent of points earned in a course to date); effort (interaction with Blackboard Vista, Purdue’s learning management system); prior academic history, including academic preparation, high school GPA, and standardized test scores; and student characteristics (the exact algorithm is proprietary). The faculty member sends the results of the on-demand evaluation to each student along with a visual indicator (a stoplight traffic signal) depicting how each student is doing.
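Since the actual Course Signals algorithm is proprietary, the following is only a toy sketch of how such a stoplight indicator might combine the four input categories Arnold and Pistilli list. The weights and cutoffs are invented for illustration:

```python
# Toy Course Signals-style stoplight indicator.  Inputs mirror the four
# categories described above; the weights and thresholds are hypothetical.
def risk_signal(pct_points, lms_logins_per_week, hs_gpa, test_percentile):
    # Normalize each input to [0, 1] and combine with invented weights.
    score = (0.5 * pct_points / 100                     # course performance
             + 0.2 * min(lms_logins_per_week / 5, 1.0)  # effort (LMS activity)
             + 0.2 * hs_gpa / 4.0                       # prior academic history
             + 0.1 * test_percentile / 100)             # standardized tests
    if score >= 0.7:
        return "green"   # on track
    if score >= 0.5:
        return "yellow"  # some concern
    return "red"         # likely falling behind

print(risk_signal(90, 6, 3.5, 80))  # green
print(risk_signal(45, 1, 2.4, 30))  # red
```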

Arnold, K. E., & Pistilli, M. D. (2012, April). Course Signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 267-270). ACM. Retrieved from:

Blogpost #5: Customer Relationship Management

Customer Relationship Management (CRM) is a common feature of retailers, but its adoption by higher education has the potential to create connections between students and student support services, and to support students through a transfer, degree, or certificate program.

El Paso Community College (EPCC) and the University of Texas at El Paso are using CRM to help students complete a reverse transfer program, which retroactively gives four-year college students credit toward associate degrees begun at a community college.

The registrars of both colleges use CRM data tools to exchange academic information on transfer students. As a result, EPCC has awarded over 5,000 associate of arts degrees since 2005. The student information system (SIS) provides student test scores, completion criteria, advancement information, and academic performance over time. Many of the processes in the system have been automated to facilitate the reverse transfer process. If a required milestone is unmet, the system provides the student with completion options. Students can check their information and perform “what if” analyses to see how their academic history matches other degree plans, helping them determine whether changing majors would allow them to graduate sooner.
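The “what if” audit described above can be sketched as follows. The course codes and degree requirements are hypothetical, not EPCC’s actual catalog:

```python
# "What if" degree audit: compare a student's completed courses
# against alternative degree plans (all plans and codes invented).
degree_plans = {
    "AA General Studies": {"ENGL1301", "MATH1314", "HIST1301", "GOVT2305"},
    "AS Computer Science": {"ENGL1301", "MATH2413", "COSC1336", "COSC1337"},
}

def what_if(completed, plans):
    # For each plan, report which requirements remain unmet.
    return {name: sorted(reqs - completed) for name, reqs in plans.items()}

completed = {"ENGL1301", "MATH1314", "HIST1301", "COSC1336"}
remaining = what_if(completed, degree_plans)
# The AA plan needs only GOVT2305, while the AS plan still needs two
# courses -- the kind of comparison that shows which major is closest
# to completion.
print(remaining)
```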

Blogpost #4

This blogpost defines Communities of Practice (CoPs) and examines how the concept can be used in introducing and integrating technology in different areas of the educational environment, focusing in particular on its use with students in the classroom and for faculty professional development.


Technology has been widely used for organizational administrative functions including among others: admission and records, financial aid, accounting, facilities planning, and institutional research and reporting. The technology helps the institution accomplish a variety of functions and meet internal and external needs, demands and requirements.

What is a Community of Practice?

Communities of Practice (CoPs) are “groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly” (Wenger, n.d.; Wenger 1998). CoPs had their origin in apprenticeship studies and have found their way into business and other sectors such as government, education, professional associations, and the nonprofit social sector (Wenger, n.d.).

Use of CoPs in the Classroom

CoPs can affect internal and external educational practices through, for example: organizing school learning as grounded in practice within subject matters relevant to the community; connecting students’ experiences to participation in the broader community; and addressing the lifelong learning needs of students by organizing CoPs focused on topics of continuing interest to graduates (Wenger, n.d., p. 5). Situating learning within the community teaches that “life itself is the main learning event” (Wenger, n.d., p. 5).

Use of CoPs for Faculty Professional Development

Local CoPs have been used for teacher training and peer-to-peer professional development (Wenger, n.d., p. 5). Communities of Practice are a particularly effective method of learning specific teaching practices (Darling-Hammond & Bransford, 2002, p. 405). The internet has removed the geographic limitations of traditional CoPs (Wenger, n.d.) widening the sources of information and expertise. The internet does not replace the local community within which it is situated; rather, it expands the potential for new community based on passion and shared practice (Wenger, n.d., p. 6).


Darling-Hammond, L., & Bransford, J. (2002). Preparing teachers for a changing world: What teachers should learn and be able to do. San Francisco, CA: Jossey-Bass.  Retrieved from

Sheninger, E. (2014). Digital Leadership: Changing paradigms for changing times. Alliance for Excellent Education webinar August 28, 2014 [HTML video]. Retrieved from

Smith, M. K. (2003). ‘Communities of practice,’ The encyclopedia of informal education [HTML Document Last updated 30 January 2005]. Retrieved from:

Wenger, E. (n.d.). Communities of practice: A brief introduction [HTML Document]. Retrieved from:

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge, UK: Cambridge University Press.

Blogpost #3

In this blogpost I identify the educational technology area/topic I am most interested in exploring and why. I share what I know about this topic and how this technology can enhance my current and future leadership position.

Educational Technology area / topic: I’m interested in exploring Learning Analytics because I may be asked to write grants to fund the implementation of this technology. As a result of this preliminary review of Learning Analytics, I’ve discovered Academic Analytics which is also of interest. Both are briefly discussed and defined in this blogpost.

Learning Analytics

Learning analytics is one application of big data. Internet, computer, and mobile device and application (app) use leaves a digital footprint, driving an explosion of available data (Long & Siemens, 2011). The quantity, speed, scale, and types of this digital data call for methods beyond the linear analyses used up to this point. The McKinsey Global Institute (a business research group) defines big data as “datasets whose size is beyond the ability of typical database software tools to capture, store, manage and analyze” (Manyika, 2011).

In higher education, digital student records capture student test scores and final grades. Student activity streams are captured by student cards, sensors, and mobile devices. In addition, learning management systems manage and monitor student progress in online courses (Long & Siemens, 2011). Learning analytics is “the measurement, collection, analysis and reporting of data about learners and their contexts for the purposes of understanding and optimizing learning and the environments in which it occurs” (Long & Siemens, 2011).

Academic Analytics

A related concept, academic analytics, is the application of business intelligence concepts in the educational setting at the institutional, regional, national, and international levels (Long & Siemens, 2011). Academic analytics “…could be thought of as the practice of mining institutional data to produce ‘actionable intelligence’” (Campbell, DeBlois, & Oblinger, 2007).

[Blogpost #3 image: Long & Siemens (2011), Table 1]

Why is it important?

Analytics, or the use of learner-produced data, is potentially transformative – improving teaching, learning, organizational efficiency, and decision making, and thereby supporting systemic change (Long & Siemens, 2011). Analytics tools can help students and instructors “better understand the learning process and take action to improve course outcomes” (Educause, 2011).


Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic analytics: A new tool for a new era [HTML Document]. Retrieved from:

Educause (2011). 7 things you should know about … First Generation Learning Analytics [HTML Document] Retrieved from:

Long, P., & Siemens, G. (2011). Penetrating the fog: Analytics in learning and education [HTML Document]. Retrieved from:

Manyika, J. (2011). Big Data: The next frontier for innovation, competition and productivity, McKinsey Global Institute [HTML Document]. Retrieved from: