
I am pleased to share today what we're planning for the future, and I appreciate all of the feedback regarding the current functionality of the New Box View experience. When Box acquired Crocodoc, we believed that Box would continue to deliver functionality and enhancements for these critical teaching workflows, but that has not been the case. As many of you know, we worked diligently with Box in an attempt to get better results. Based on Box's lack of responsiveness, quality control issues, and your feedback and input (which accelerated our thinking on how we must improve your experience in this area), we have decided to replace New Box View and move forward with a new solution that is more in our control, in order to better meet your current and evolving needs.


I also wanted to share a bit more about the timing of this message. Over the last couple of months, we were working with Box and were in the middle of negotiating the immediate partnership agreement, which is why we did not share as much as we would have liked. At the same time, we were beginning technical diligence while building out more robust requirements with your feedback. The list of enhancements gathered on the Community site has been a beacon throughout the process for what needs to be included in the future workflow for both Original and Ultra. We need to not only deliver the core features you appreciated with Crocodoc but also add innovative functionality. We are currently continuing the technical diligence, gathering your feedback on requirements and prioritization, and deciding upon the best architecture and solution, one that will give us the ability to be dynamic and innovative as your needs change over time. To be honest, this is not easy work to get right, and we need this analysis done in order to share a more definitive timeline. We will share more details on timing and functionality in mid-June, and then again during BbWorld. As we work towards the release, we will provide regular updates on progress as well as targeted outreach for feedback opportunities, so you'll be hearing from me or others working on the project for input.


So as you can see, we're set on moving in a new direction with this critical capability. As we go through this process, we are not going to stop working with Box to push for innovation in New Box View based on your needs. While we can't predict how successful this will be, we understand that you'll be using New Box View for Inline Grading in the near term, so we want to make sure this experience is as good as it can possibly be.


I want to say a big THANK YOU for your ongoing engagement on the Community site and elsewhere.  We know that this has been a challenging topic, but I hope this update, and our updates going forward, will help give you clarity and confidence about the improvements coming in this important area.

About: Internet Explorer and Blackboard Learn don't play nice together. This article explains how to create a warning that will appear only for users who attempt to log in to your Blackboard environment using Internet Explorer.


What's the Problem?

For the better part of my career as an LMS administrator, I've told anyone who would listen that Internet Exploder (Explorer) is good for only one thing: downloading a different browser. While I've seen many people abandon the little blue "e" for a more modern browser, there remains a coterie of diehard IE adherents who simply refuse to switch.


In their defense, there are plenty of reasons folks may still be on IE. Some are running old machines. Some lack (or think they lack) the technical knowledge needed to update their browser. Others are comfortable with the way things are and just don't want to change. Whatever the reason (and I'm sure the reason is a very good one), it's time for a change. It's 2019. Microsoft officially ended support for Internet Explorer three years ago, and the old blue "e" just ain't what it used to be.


Why Now?


Deep in the heart of the Learn SaaS v3500.0.3 Flexible Deployment release (roughly equivalent to v3500.9.0 for Continuous Delivery), something changed, rendering some functionality useless for Internet Explorer users. Since the release of 3500.0.3, we've received calls from frustrated IE users for whom the platform no longer functioned properly, and we had to break the bad news that the only way to fix their problem was to stop using Internet Explorer.


But it's far better to catch these users before they log in to the system so that they aren't in the middle of something when they realize that there is an issue.


To help minimize frustration for our end users, we created an "Incompatible Browser" warning that appears on the login page. What we didn't want to do was use the login page announcements area (because those messages cycle through) or display the warning to users who were already on a compatible browser.


So we had to come up with a different solution...

NOTE: If you're working with a single sign-on portal that bypasses the Blackboard login page, you might as well stop reading here; contact your portal admin and ask for their help in creating an appropriate intervention for IE users.


Danger, Will Robinson! Danger!


Still with me? Cool.


Our solution to this issue was to create a warning. It's only a warning. It doesn't force anyone to do anything, but it provides users with critical information based upon their system configuration, explains the problem and provides instructions on how to correct the issue.


The warning is designed so it will only be seen by IE users but will not prevent them from logging in. It is also configured so that it cannot be dismissed by the end user.


Since we're writing code for old browsers, we want to keep it as simple as possible and avoid using JavaScript or other elements that the browser might not recognize or render properly. So, the simplest solution is one that is 100% CSS and HTML.

CAVEAT: Each institution's login page is configured differently. Before beginning, make sure you have a backup copy of your login page downloaded to a local machine. When you go to test your changes, make sure you're logged in to your environment in a separate browser from the one through which you're uploading the updated .jsp file. This way, if you accidentally insert code that breaks your login page, you can still change it back to the previous version.

To Catch IE (or, the part you want to go to if you skipped all that intro text)


To ensure that you catch all the IE users, you have to approach it from two different directions, because in the world of Internet Explorer, there are two different technologies. Newer versions of Internet Explorer (10 and up) can be identified using a media query. Older versions of Internet Explorer (9 and earlier) can be detected with a conditional comment (a technology that was abandoned after version 9).


First Things First

Both technologies will display the same warning. So we're creating a <div> element and making it visible when certain conditions are met. Before we worry about how to make it visible, we need to first create the element and hide it. To do this, I opened the login.jsp file and found the CSS section. After the main components were configured, I added the <div> element and set it to be hidden by default. It looks like this...


#IEMessage {
    display: none;
}

Media Queries

I mentioned that there were two different ways we had to approach this. The first is the more modern media query method to detect IE10 and higher. I'm not going to go into what media queries are (you can do that research on your own), but suffice it to say that it's a little piece of CSS that says to the browser, "Hey! If you can interpret this code, please use these parameters. Otherwise, ignore it." In this case, all browsers EXCEPT Internet Explorer will look at the code and be like, "I have no idea what you're talking about. I'm ignoring you." You can add this almost anywhere in the CSS; just make sure it comes after the rule that hides the div, since later rules of equal specificity win.


The @media all and (-ms-high-contrast: active), (-ms-high-contrast: none) media query tells the browser to display the div ONLY IF the stated conditions are met, and since -ms-high-contrast is a setting that only pertains to IE10+, it will be ignored by everything except IE10+. The most important part is display: inline-block; because that's what tells the browser to make the div visible (there are options other than inline-block, but I'll let you research that on your own). The rest is purely cosmetic.



@media all and (-ms-high-contrast: active), (-ms-high-contrast: none) {
    #IEMessage {
        border: 4px outset #EAEAEA;
        display: inline-block;
        margin: auto;
        position: relative;
        text-align: center;
        width: 70%;
        max-width: 600px;
        background-color: #FAEBD7;
        color: #CCCCCC;
    }
}


Conditional Comments

Conditional comments only work in IE9 and older versions. They are statements that go in the HTML part of the page (not the CSS) and should immediately follow any <meta> tags. The conditional comment, similar to the media query, says to the browser, "Hey! If you can read this, do this. Otherwise, ignore it." To take your CSS from the media query and put it into the conditional comment, just paste everything from the #IEMessage rule into the space between the <style> and </style> tags.



<!--[if lte IE 9]>
<style>
#IEMessage {
    border: 4px outset #EAEAEA;
    display: inline-block;
    margin: auto;
    position: relative;
    text-align: center;
    width: 70%;
    max-width: 600px;
    background-color: #FAEBD7;
    color: #CCCCCC;
}
</style>
<![endif]-->
Add the <div> to the <html>

The HTML area of the login.jsp file lays out exactly how the elements on your login page will be displayed. In this part, you WILL need to worry about positioning, because this is the content and formatting information that is passed to the browser.


In our case, I added the warning above the login options (we have three buttons from which users select their role in our system and are then directed to the login page appropriate to their role).


This part contains the actual text of the message, and you can say whatever you want, just make sure it's properly formatted. The key is to create a <div> with the ID of "IEMessage" (or whatever you named the div in the CSS) and insert all your code.



<div id="IEMessage" style="margin-left: 5px; margin-right: 5px;">
<b><i>We have detected that you are using Internet Explorer to access Blackboard.</i></b><br><br>
Please note that Internet Explorer is <b>not</b> compatible with Blackboard and may not perform as expected. To avoid errors and frustration, we <b>strongly</b> suggest that you use a different web browser.<br><br>
For best results, we recommend <a href="" target="_blank">Google Chrome</a> or <a href="" target="_blank">Mozilla Firefox</a>,<br> but any of the <a href="" target="_blank">supported browsers in this list</a> will work as well.<br><br>
If you wish to proceed using Internet Explorer, you may log in below.
</div>



Now, all you have to do is save your updated file, upload it into Blackboard and navigate to your login page using Internet Explorer. If it worked, you'll see your message displayed.




To see how it will render under various versions of Internet Explorer, you can click on the cog in the IE menu and select "F12 Developer Tools" from the dropdown. To preview the page as it would be rendered in other versions of Internet Explorer, change the document mode by clicking on the funky little document mode icon (just to the right of "Memory" in the menubar) and selecting which version of IE you'd like to render the page using.




At this point, there are only 3 more things you need to do....


  1. Tweak.
  2. Test.
  3. Repeat.


Hopefully this made some sense to someone out there.

Update April 3rd: Box reported an incident starting at 1:13 PM EDT which impacted New Box View.  Users might have seen latency and/or errors while using Inline Grading within the last few hours.  The issue has been resolved as of 3:28 PM EDT and service should be fully restored.  If you are still seeing any errors or latency, please let me know.


Update March 28th: Box is postponing tomorrow's release to ensure all of the issues we reported are addressed.  We will be working with them to get an updated timeline and will provide updates as we know more.


Update March 22: In case you weren't able to attend the 9.1 Office Hours meeting, Box is tentatively planning to release an update on March 29th. We will be testing this release before it goes live to confirm that the planned fixes work and that the critical regressions from the last release are resolved. I'll be sharing more information next week, but wanted to give everyone advance notice of the upcoming release.


You've probably seen me responding to New Box issues but I haven't taken the time to fully introduce myself.  Some of you may know me from Cumulative Update releases, bug discussions or from our Learn Technical Preview program.  If you were an ANGEL client, you might even remember me from Tier 1 Support. I have been with Blackboard for almost 10 years and on the Learn Product Management team for the last 4.5 years.


With Trey's departure last month, I am the new lead with the New Box View integration for Inline Assignment Grading and our partnership with Box, Inc.


After researching over the past few weeks, I've put together a list of the most requested features and where we need to go with the feature. Based on your feedback, this will help bring New Box View into better alignment with the previous Crocodoc workflows and add additional functionality.


  1. Sidebar Summary View: In progress with Box and targeted for release later in the year.
  2. Editing annotations
  3. Re-usable comments/ Comment Bank
  4. Video based comments and feedback
  5. Strike-through Annotation Type
  6. Different colors for highlighting tool
  7. Different line color and drawing line thickness
  8. Indication of Comment in Highlighted Text
  9. Eraser



In addition to this list, there are navigational and display elements that need improving as well to enhance usability and flow.  While not considered new features, it's worth noting that these will also be reviewed as we drive roadmap discussions with Box.


Also, as we move forward, I will add that it's never my intention to be vague with release timelines when sharing information.  I promise to be open as much as possible with available information.  Part of the conversations we're having with Box is improving timeline communication.


Please feel free to leave any additional feedback on this post. Part of the research was reviewing Community site threads, but there is a possibility I have missed some important ideas.  I will be at the Office Hours meeting on March 19th at 12 PM ET that Edgar Gonzalez and Mark Burris host to review and discuss feedback as well.


As always, I appreciate the candid feedback and passion around improving this feature so it better fits the needs of users.


Julia Miller

Product Manager, Learn

Update: 2/22 - 7 PM ET


This is an update to our earlier communication regarding the planned rollback of New Box View to an earlier, stable release.  At this time, Box has informed us that the rollback is complete.  The Blackboard team has confirmed this.  We expect that this rollback has resolved the earlier issues reported and described here.  We are completing initial testing to validate resolution, and as always, if you find any issues have persisted, or if you have reason to believe there is a new issue, please let us know immediately using Behind the Blackboard.


Once again, we sincerely apologize for the disruption this release has caused. We will provide further communication in the coming days about additional steps we will be taking with Box to help prevent future incidents.


Update: 2/22 - 6 PM ET

As we shared in our earlier communications on this topic, we've been in active communication with Box since these issues were reported. Given the urgency of the situation, Box is prioritizing work to roll back to an earlier, stable release. The Box team is assessing the details now, and based on their guidance, we expect this rollback to happen tonight, February 22nd, Eastern Time.


When the rollback completes, we will share another update to confirm the operation is complete.


Thank you for your patience and understanding as we continue working to resolve these issues.

FEB 22 UPDATE: Please refer to this new post for the most recent update regarding a rollback to an earlier release of New Box View.


Original Post Below From Feb 21 ---------------


Your concerns and frustrations are definitely being heard with regard to the critical bugs introduced this week, and we're actively working towards a resolution. Since I'm receiving reports of NBV bugs through various channels, I wanted to use this post as the main location to document the individual issues, share workarounds, and provide updates on timing. I apologize for not having a resolution timeline at the moment, but Box is actively working on providing those to me after additional discussions today.


  1. Highlight Tool: When the resolution is low (below 1600x1200), or the browser window is zoomed in or not maximized, the highlight tool disappears.
    1. Resolution Timeline: TBD
    2. Workaround: See silent recording here: Video Link
    3. Related Threads:
      1. new Box view update - no highlights - DRAWINGS DON'T DELETE
  2. Drawings: Once you have added a drawing, the delete tool is missing.
    1. Resolution Timeline: TBD
    2. Workaround: Submit the assignment to save changes, then go back into the attempt, expand the iframe and THEN you can select drawings to delete them.
  3. Comments: When adding a comment, the cursor doesn't automatically appear in the "write a comment" text box.
    1. Resolution Timeline: TBD
    2. Related Threads:
      1. BOX Grading Toolset Travesty
  4. Stylus Input: No hardware stylus functions properly (thanks: Amy Eyre and Aderemi Artis). (Was "iPad Pro: Stylus input has stopped working on iPad Pro.")
  5. Trash Bin Icon
    1. Resolution Timeline: TBD
  6. Comment Box Panel
    1. While Box categorized the display change as more responsive and currently functioning as designed, with the long panel showing to the right, there may still be some inconsistencies in the experience. I'm reading through various threads to see if I can tease out any underlying issues.


I also wanted to re-share the document Amy put together as others are finding it helpful. Thanks Amy Eyre!

New Box View - Staff - "Small Screen" Workaround Guidance - Google Docs.


While there are still outstanding issues that need to be solved, the focus right now is tackling any new critical items found with the update this week as quickly as possible. I am compiling a list of any outstanding bugs and enhancements to start tackling next.




2/22 Update: While we don't have timelines yet, I wanted to share a status update on a portion of the items. The ability to delete annotations (#2) is actively being worked on by Box engineers. The missing highlight capabilities (#1) and the cursor automatically populating in the text box (#3) are still being investigated. Regarding the trash bin icon (#5), Box expects the current behavior to better match other Box products; however, they are discussing reverting it.

Thanks everyone for all your comments and feedback to this post so far. I am reviewing all of them to capture additional bugs and feedback for Box.

A feature in Blackboard called Goals allows faculty and departments to collect information, for accreditation or other purposes, about how programs and curricula align with course goals.


Faculty can align course content and assessments (e.g., discussion forums and threads, blogs, journals, tests and individual questions, assignments, and Grade Center columns) to one or multiple goals.

Reports can then be run that display how students are performing in alignment with the associated course or department goals.


Departments seek continuous quality improvement in program and course curricula for many reasons. In the School of Computing and Information Systems at GVSU, there is a need to effectively support accreditation requirements as well as to focus on improving the student experience. To this end, the department designed an assessment plan that involves all faculty around the following claims:

  1. We know what we do
  2. We do it well
  3. We can prove it


A number of assessment tools can be used to collect data and evidence for the improvement program, such as:

  • Senior exit survey
  • Focus Groups
  • Internship supervisor survey
  • Comprehensive exit exam (standardized or local)
  • Student portfolios evaluated by committee
  • Faculty self-reflection about each course
  • External evaluators of student performance
  • Performance criteria within each course


As part of the program, faculty collect samples of student work. This may include making copies of each assignment and exam, representing high, average, and low student performance. The criteria for student performance should be measurable, defined by observable behavior, and identify a specific standard (or minimum standard).

The Blackboard Goals feature allows instructors to mark assessments or content in Blackboard Learn courses as performance criteria for reaching a curriculum goal. While a curriculum goal does not have to be the same as a course learning objective, they often overlap. In turn, Blackboard Learn collects student scores and compares them to a set target performance level and average range. This helps identify students who meet expectations or fall outside of them.


The resulting Blackboard course performance report summarizes how students met the expected performance criteria against one or more set goals. The course itself can be archived or copied to retain assessment artifacts and assignment samples. The report is granular enough to provide a breakdown of academic performance per student on each assessment and curriculum goal. The following sample reports come from Introduction to Computing, CIS 150.





Overall, the Goals feature in Blackboard Learn is a very useful tool for collecting academic data and including it in a larger, final report submitted for each course, which includes faculty reflections, student evaluation samples, assessment of previous adjustments, and proposed changes for the future.


Thanks to Eric Kunnen for his ongoing support of key instructional projects.


How to Run a Goals Report - YouTube

Original post at GVSU


Thanks Chris Bray for the Blackboard Learn Goals XML Generator!


Here are the details:

Generates XML files that can be imported into Blackboard Learn for use with Goals, based on the documentation at



Example Files:


Bb Grader Announcement

Posted by eschramke Jan 25, 2019

Effective March 29, 2019, the Blackboard Bb Grader app will no longer be supported.

Bb Grader will be removed from the Apple App Store, and support will cease on March 29, 2019 in all markets. This discontinuation furthers the progress we've made across our product portfolio to simplify, consolidate, and promote our next-generation solutions.


As a part of our commitment to delivering a portfolio of high-quality products to help our clients thrive in a complex and changing educational environment, we’ve planned a robust roadmap that continues to build upon the grading capabilities already available in Blackboard Instructor to provide instructors the best classroom experience while on the go.


To learn more about grading in the Blackboard Instructor app, please visit


Calling All System Admins!

Posted by stolleb Jan 24, 2019

Have your say in the evolution of the System Admin Panel in Blackboard Learn Ultra!


As part of our continued commitment to research, design and iteration in Learn Ultra, we want to hear more about how you currently use the Administrator Tools page in Learn. What features do you use? What works well--or doesn't? What would you change? We'd appreciate your input, even if you don't currently use the Ultra experience.


Help shape the tools you use to deliver and support Blackboard Learn at your institution by filling out this survey: Blackboard Admin Panel Survey (The survey should take approximately 10 minutes to complete.)


Thanks for helping inform the future of Blackboard!

We're happy to announce a new feature in BbStats: Latent Class Analysis with an annual adoption report.

Some time ago we reported on a new study at the University of Illinois at Chicago, Patterns in Faculty Learning Management System Use. This research was featured at DevCon in 2017 and in Top 7 findings in the new study on Blackboard usage.


Running your own Latent Class Analysis study may be of interest to you, but many priorities are likely competing for your time. (Latent Class Analysis, by the way, is a statistical method for identifying unmeasured class membership among subjects using categorical and/or continuous observed variables.) So, while we can't run a custom study for you, BbStats can now identify the latent groups already documented in the UIC study and graph how many of your courses belong to each. It doesn't do this by running the statistical model itself; you would only do that to discover new latent groups or groups unique to your organization. As long as you accept the findings of the UIC study, BbStats will identify the three documented groups in your data.


Latent group analysis is a process of grouping data to discover new patterns. In a way, the data itself speaks to you through the emerging patterns. Extracting the course design data from Blackboard with BbStats and running the statistical model at UIC identified the following groups: Holistic, Complementary, and Content Repository.


Course data was extracted from 2562 courses with 98,381 student enrollments during the Fall of 2016. A latent class analysis was conducted to identify the patterns of LMS tool use based on the presence of grade center columns, announcements, assignments, discussion boards, and assessments within each course. Three latent classes of courses were identified and characterized as Holistic tool use (28% of the courses), Complementary tool use (51%), and Content repository (21%).


Following is the process to replicate the study report in your Blackboard Learn system:


1. Count all courses with an active grade center. According to the study, this includes Holistic and Complementary courses, but excludes Content Repository courses.

2. Count all courses with an active grade center, announcements, and discussion forums, which identifies the Holistic group with very close approximation.

3. Identify all courses accessed by students, which indicate courses that were active at some point in history.


With the above three groups identified, the system makes the following calculations:


Holistic courses = All courses with an active grade center, announcements, and discussion forums.

Complementary courses = All courses with an active grade center minus Holistic courses

Content Repository = All courses accessed by students minus all courses with an active grade center
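As a rough sketch, the three calculations above can be expressed in a few lines of code. This is not how BbStats implements it internally; the field names below are hypothetical stand-ins for whatever per-course flags you extract from your own Learn system:

```python
# Sketch of the UIC-style group assignment described above.
# Each course is a dict of booleans; the field names are hypothetical.

def classify_courses(courses):
    """Split courses into Holistic, Complementary, and Content Repository."""
    graded = [c for c in courses if c["active_grade_center"]]

    # Holistic = active grade center + announcements + discussion forums
    holistic = [c for c in graded
                if c["announcements"] and c["discussion_forums"]]

    # Complementary = all graded courses minus the Holistic ones
    complementary = [c for c in graded
                     if not (c["announcements"] and c["discussion_forums"])]

    # Content Repository = accessed by students but no active grade center
    content_repository = [c for c in courses
                          if c["accessed_by_students"]
                          and not c["active_grade_center"]]

    return holistic, complementary, content_repository
```

Note that under these formulas a course that was never accessed by students and has no active grade center falls into none of the three groups, which is consistent with the study's use of student access to identify active courses.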


This figure, taken from the UIC study, shows the basis for the above calculations:

Screen Shot 2018-09-08 at 5.12.11 PM.png

There are two ways of running the Latent Class Analysis report in BbStats.


1. You can specify a pattern for the course_id to investigate course groups or departments. This may be different for your school (Figure 2). Examples:


Show latent groups for the Fall 2018 courses for the Computer Information Systems department:



Show latent groups for calendar year 2016:



Identify latent group for a specific course:



2. The second way of using the new report is to run the Annual Latent Class Analysis. This shows an academic-year breakdown based on course creation date, from March 1, 2013 to March 1, 2018. The assumption is that courses are created after March 1, prior to the Spring/Summer term. This report does not require any course_id patterns (Figure 3).


Figure 2.

Screen Shot 2018-11-01 at 7.18.05 PM.png


Figure 3.

Screen Shot 2018-11-01 at 7.17.36 PM.png


Try it in your staging system. OCELOT: BbStats.

Download the Springer journal article:



The system expects that courses are created inactive and that instructors enable them for students to see. This means that if students have accessed a course, it was active. Also, any new gradebook column counts as usage of the gradebook, so if all new courses are pre-populated with gradebook columns, the Complementary category will be inflated. However, if faculty create their own gradebook columns (for example, by copying courses) and inactive courses are not available to students, the graphs should be accurate.

(this post is a continuation from my prior post in September 2018)


Time for another update on this topic! This month's update will be short but juicy, so let's dive in:


Addition of 'Comment Summary' to Annotated PDF Downloads

When we rolled out the 'download PDFs with annotations' capability, one piece of feedback we received was that point- or comment-based annotations had to be expanded one at a time before they could be printed for physical review. Also, expanding the comments within the PDF, particularly longer comments or comments with lots of text, could cover up or block content on the page, making it more difficult to view the content the annotation was associated with.


To help alleviate this, on 10/23/2018 we implemented new functionality that adds a 'Comment Summary' section to the end of annotated PDFs when downloaded from New Box View. This new feature does a few things at once:


  • Adds numbered labels to comments based on location within the document, with numbering starting at the top comment
  • Adds a 'Comment Summary' section to the end of the PDF
  • Lists comments based on (1) page number and (2) comment number


Here are some screenshots of the functionality in action:


Screen Shot 2018-10-23 at 5.14.22 PM.png

Note the three Point Comments in New Box View


Screen Shot 2018-10-23 at 5.14.32 PM.png

Same three comments numbered in annotated PDF download


Screen Shot 2018-10-23 at 5.14.51 PM.png

Comment Summary with page number and numbered comments listed


This new functionality has been implemented within the microservice that supports this feature, so no change or update to Learn is required to access this feature. Go check it out now!


Note: this is separate from and unrelated to the requested 'Summary View' feature, which would present an aggregated view of annotations while viewing files online within New Box View; this new comment summary applies only to downloaded PDFs containing annotations.


Box Deprecation of TLS 1.0 and Supported Web Browsers

Technology companies across the Internet have been broadly deprecating support for the older TLS 1.0 standard, and Box is following suit. Here is the link to the Box information on the change. There are two relevant points for Blackboard clients:


Supported Browsers

The Box and Blackboard matrices for browsers supporting the newer TLS 1.1 standard are available here: Box Supported Browsers and Blackboard Supported Browsers. Directly comparing the matrices:


Screen Shot 2018-10-26 at 2.27.43 PM.png

Box supported browser versions


Screen Shot 2018-10-26 at 2.27.50 PM.png

Blackboard Learn supported browser versions


The main point is that the oldest browser versions supported by Blackboard are significantly newer than the oldest browsers that support at least TLS 1.1, so there should not be any issue for users on Blackboard-supported versions of browsers.


Timing of Deprecation

Box originally announced the end of support for TLS 1.0 on 6/25/2018 and will completely deprecate TLS 1.0 calls on 11/12/2018. This means users need to be on a browser supporting TLS 1.1 before 11/12, or they will not be able to use New Box View within Learn.


Note that Blackboard removed support for TLS 1.0 and 1.1 for Learn SaaS customers on 8/31/2018.
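If you want a concrete feel for what a protocol floor means on the client side, here is a minimal Python sketch (illustrative only, not part of Learn or Box): it builds a TLS context that refuses handshakes below TLS 1.2, which mirrors from the client side what Box's server-side deprecation does to browsers that can only offer TLS 1.0.

```python
import ssl

# Illustrative only: a client context whose minimum protocol version
# is TLS 1.2. Any handshake against a peer that only offers TLS 1.0
# (or 1.1) will fail outright -- the same effect, seen from the other
# direction, as Box dropping TLS 1.0 on its servers.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version.name)  # TLSv1_2
```

Any modern browser on Blackboard's supported list already negotiates TLS 1.2 or better, so the floor only bites genuinely outdated clients.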


Thank you for your continued patience and support as Blackboard and Box continue to improve this area of functionality!

The top findings from the Patterns in Faculty Learning Management System Use | SpringerLink research study at the University of Illinois at Chicago.

Screen Shot 2018-09-23 at 3.04.16 PM.png


1. Online courses at UIC focus on holistic use of LMS tools.  68.3% of hybrid/elearning courses, as opposed to in-person courses, were in this latent analysis group and used five key tools: content items, grade center, announcements, discussions, and digital assessments.  Only 22.5% of in-person courses were found in this group, and just 3% of hybrid/elearning courses fell into the content repository group.


2. 53.7% of in-person courses at UIC were in the complementary usage group, meaning they used three main tools: grade center, announcements, and assignments.  The remaining in-person courses split 22.5% holistic and 23.7% content repository.  Content repository courses used content items and announcements (without the grade center, assignments, or digital assessments).


3. The holistic group contained courses with larger class sizes and a greater likelihood of online delivery.


4. There is clearly a gap between how students spend their time in the digital content of courses and faculty design intentions.  Perhaps the time students spend on course items reflects their best judgment about what will make them successful in the course.  Faculty may be designing opportunities for students that are not well communicated or utilized.  Further research is needed to bridge this gap and match student digital behavior with faculty expectations and their design for learning.


5. The aggregate profiles of courses by school or college often reflect the general nature of the programs and curricular approach.  The adoption of specific tools in the digital portion of a course should not be correlated with the academic quality of the program or the effectiveness of instruction.  This approach reports only on the selection of tools in the Blackboard Learn portion of the course design. However, this presentation of the results may suggest resources that may be needed by specific colleges, such as assigned instructional designers or instructor training sessions in specific Blackboard tool use. Finally, as the body of knowledge about the Scholarship of Teaching and Learning (SoTL) continues to increase, exploring the tool use in the local learning management system may help to distribute SoTL findings to instructors and colleges according to the digital evidence of their course design (Englund, Olofsson, & Price, 2017; Openo et al., 2017).


6. The findings in this study need to be related to the patterns identified previously by measuring student use of courses across a large data set (Whitmer et al., 2016). That study did not report on faculty intent, in terms of the design of the course, but on the time students spent consuming content. The present study adds that while students may be spending a large amount of time in the content of the course, at least 28% of courses at UIC were created with well-rounded opportunities for students to engage in assessments, discussions, announcements, assignments, and reviewing grade postings.  This affects approximately 31,417 student course enrollments (706 courses with 44.5 enrollments on average, out of a total of 2,562 courses with an average of 38.4 student course enrollments). The definition of the Complementary group in the study by Whitmer et al. (2016) included content with announcements and the use of the gradebook.  Our study identifies a different course design profile as Complementary tool use: it includes digital assignments for roughly half of the courses.  Perhaps the time required to complete the assignments cannot be recorded in the student activity data; however, there is a clear intention on the side of faculty for students to submit their assignments through the LMS.  Along with the Holistic tool use profile of courses, they make up 79% of courses at UIC.
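The enrollment arithmetic in point 6 is easy to check:

```python
# Figures quoted in point 6 above.
courses = 706          # courses offering well-rounded engagement opportunities
avg_enrollment = 44.5  # average student enrollments per such course

affected_enrollments = courses * avg_enrollment
print(affected_enrollments)  # 31417.0
```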


7. The use of the Blackboard Learn system as a “Content repository” makes up only 21% of the system (and only 3% of hybrid/elearning courses).  This latent class profile may represent the initial phases of a faculty member digitizing a course experience, or a view of the role of technology in teaching as faculty-to-student and content-to-student communication.  Certainly, this intent by faculty does not tap into student-to-student communication, digital collaboration between faculty and students (assignments) or among students (groups, discussions), or digital assessment (quizzes or exams) in the system.  It may be that these teaching and learning dimensions are facilitated in the classroom or in other systems.


Screen Shot 2018-09-08 at 5.12.23 PM.png

Screen Shot 2018-09-08 at 5.12.11 PM.png




Patterns in Faculty Learning Management System Use

ResearchGate: Patterns in Faculty Learning Management System Use

Springer Nature Reader



Machajewski, S., Steffen, A., Romero Fuerte, E., & Rivera, E. (2018). Patterns in Faculty Learning Management System Use. TechTrends.

Last week we had an email avalanche when a teacher sent an email via a Personalized Learning Designer (PLD) rule.
The idea was to send a personalized email to every student.
The rule was simple: send an email at a specific date and time.
The email contained the token ((student_firstname)) and was sent to all users in the course with the role student.

This apparently created 400 emails, one for every student, but each of those 400 emails was then sent to every student: 400 × 400 = 160,000 emails in total. The expected behaviour was: send one email to every student.

We now realize that he should probably have sent the email to 'Triggering user', but it's not immediately clear that including the token is actually a trigger. Or is it? Maybe you can clarify this for us.

It would be nice to have a preview feature for PLD, and maybe even a warning when a user tries to do something that would trigger so many emails: "You will be sending 160,000 emails! Do you really want to do that?"
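To make the failure mode concrete, here is a small Python sketch (purely illustrative; none of these names exist in the PLD itself) contrasting the intended per-student send with what actually happened:

```python
# Hypothetical roster of 400 students, matching the incident above.
students = [f"student{i}" for i in range(400)]

def personalize(template, name):
    """Substitute the PLD-style first-name token into the message body."""
    return template.replace("((student_firstname))", name)

template = "Hi ((student_firstname))!"

# Intended behaviour: one personalized email per student -> 400 sends.
intended = [(s, personalize(template, s)) for s in students]

# What actually happened: each personalized copy was delivered to the
# whole roster -> 400 * 400 = 160,000 sends.
actual = [(recipient, personalize(template, s))
          for s in students
          for recipient in students]

print(len(intended), len(actual))  # 400 160000
```

The multiplication is exactly why the recipient setting matters: the number of deliveries is (personalized copies) × (recipients per copy).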

(this post is a continuation update from my prior post in July 2018)


9/28 Edit: updated expected CU delivery dates

9/14 Edit: updated Excel file annotations info

10/16 Edit: updated delivered CU dates and links


It's been ~six weeks since my last update, and there have been some new developments on multiple fronts that I thought I'd share:


Excel-based File Annotations

Box has informed Blackboard that today, September 13th, they will be enabling annotation capabilities for Excel-based file formats (.xls, .xlsx, etc.).  Once available, this functionality will represent another improvement to New Box View and another gap closed against the former Crocodoc solution.


The exact time this functionality will be available is not known, but I will keep checking and update this post once confirmed!

Excel files can now be annotated!  Go try it yourself!


Screen Shot 2018-09-14 at 4.20.35 PM.png


Download with Annotations in Learn Original Bug Fix Availability

Since the release of the Download with Annotations capability, a bug was identified that prevents the download of annotated documents when using the 'My Grades' workflow.  We've identified the underlying issue and have a fix ready, but the fix must be applied to Learn instances (not within a microservice).  The next round of planned Cumulative Updates for Learn will provide this fix and is currently targeted for the following dates:



These dates are still targets and could move, but this is our expected timing based on what we know right now.  I'll update this post as the CUs become available.


In the interim, the workaround for this issue is to use the Download with Annotations capability by accessing the attempt through the Content -> Assignment -> Attempt workflow.


Accessibility Update

Box has made internal progress on improving the accessibility of New Box View: the main content and navigation UI has been accessible for quite a while, and annotations, the next priority, have also seen progress.  However, the improvements will not be available until the general annotations features make their way into the main Box content management application, and I don't yet have a timeframe from Box on that.  That said, given the development cadence and agile methodology that Box follows, movement and communication from Box means that we're getting closer!  As soon as I have more information, I'll share it via an update to this post or a new blog post.


I've also updated the issue tracking spreadsheet this month as well.

I was going to start a discussion and solicit feedback, then I realized I could just as easily create a blog post and have the discussion in the comments section.   So here we are.


What's this about?


Part of staying on top of my work is keeping up with Known Issues in Blackboard.  A lot of times, I don't have time to log in to BtBb and look things up, so what I'll do is keep a copy of the Known Issues spreadsheet on my workstation for quick reference. This blog post explains the hows and the whys, and also goes into what I do with the data once I've got it.


Getting the Data from BtBb


Log in to Behind the Blackboard and click on the link to Known Issues.

btbb_platform_select.png

Once the Known Issues page is loaded, you'll want to narrow down the results a bit.  To do this, click the "Release" dropdown and select your environment.


NOTE:  There are only two options available, 9.1 and SaaS. If you don't know which of these is applicable to your institution, you probably shouldn't be logging into BtBb in the first place.


So as the image to the right will attest, I've selected my release (SaaS), and my Product (Learn) and Article Type (Known Issues) were selected by default when I clicked on the "Known Issues" link from the BtBb main page.



export_to_excel.png

So once I've selected my Release, I click on "Export to Excel" to download the data.  (The "Export to Excel" icon is on the upper region of the page, on the right-hand side, hidden in plain sight, as it were.)


Your browser will download the file, and depending on your browser and system configuration, you may or may not be prompted to do something. My machine is so accustomed to me doing this that it doesn't even bother me when I go to download pretty much anything from Bb.


We've got data, 'cause we've got a band.1


Now that the XLS file has been downloaded to your workstation, you're ready to get to work and do all sorts of magical things.   So locate that file on your machine and open it in Excel.


excel_btbb_error.png

When you try to open the file, Excel will return an error, indicating that the file format and extension don't match and asking if you want to open it anyway.  Click "Yes" to ignore the error and open the file.



When you open the file (against Excel's better judgement), something weird happens.
There's data that resembles a spreadsheet, but what in tarnation is all that gobbledegook in the very first cell?

btbb_export_01.png


That, my friends, is something they didn't teach you at Monument University.

When you download the .XLS file, there's a surprise hidden inside.


The spreadsheet from BtBb is NOT an XLS file at all, but rather an HTML document with a .XLS extension.  Excel can open it and present the data because it’s HTML and it’s laid out in tables, so Excel says “oh.  Ok.  I can figure this out.” and it goes from there.  But because it’s not “real” Excel data, it’s hard to work with (and that’s also why it’s always got that big ugly line of code in the top row).


To see for yourself, download the file from BtBb, change the extension to .HTM (or .HTML), then open the renamed file in your favorite text editor or browser.   It will look okay, but it’s still not quite right.  If you copy and paste that text into Excel, you’ll have a lot of blank rows and your sorts will be wonky as a result.


The next section shows how to fix that.


Making Sense of the Weird Exported Data


Open the HTML document in a text editor (There are tons of them out there.  I'm a fan of Atom because it's cross-platform.  But I confess, if I'm on a PC, I'm partial to Notepad++).




Once you've opened the document, delete everything before the <table> tag.  There's a lot.  Most of it is in the <script>, which is actually the only thing you need to get rid of, but it's simpler just to get rid of everything rather than look for the start and end of the <script>, especially if you're not used to working with HTML code in a text editor.



Now we are left with a document with no HTML heading information.  This would be a terrible practice for actual HTML coding, but we're not doing actual HTML coding for publication, we're just cleaning up a mess.  Your browser will still read it without issue (browsers are smart like that).



If you saved the file at this point and opened it with your browser (or with Excel), you'd see that the garbage at the top was gone.   But we want to do something else before we save it.   What you can't see in the browser is that there are several <p> elements in the code that would cause Excel to render the data as a blank row.  This isn't cool.   So you want to eradicate all of these. 


So...  do a find/replace for <p>, replacing each instance of <p> with a delimiter of your choice (I like ; in this instance, but that’s just me). 


Save the file.




We're not quite done.  Now that you've saved the HTML file, open it in your web browser of choice.  Magically, all of the extraneous rows are now gone.  Now, copy all of that data in the table (the one in your browser) into a blank Excel document.  Depending on your computer, it may take a bit, as there is a lot of data there.


However, if you check out the Excel spreadsheet, you'll notice that the data is clean and much easier to work with.


So save the Excel file, and you now have a copy of the Known Issues saved locally.  YAY!!


Wrapping it up


This isn't the end, but it gives you something to work with.  The real tricky part comes when you want to make those dates something you can actually use, because at the moment, Excel is treating that data as generic text because it doesn't know any better.


If there's interest, I'll write a follow-up piece to this about how to clean up the date data so that it actually means something.


I will confess, this is a clunky, labor-intensive method, and I’ll come up with a more elegant method eventually, but for the moment, it gets the job done.
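In the spirit of that "more elegant method eventually," here is one way the whole cleanup could be scripted, assuming Python is handy.  Everything here is my own sketch (the filenames and function names are hypothetical, not anything Blackboard ships): the standard library's HTML parser pulls the rows out of the first table, stray <p> tags inside cells simply vanish instead of becoming blank rows, and the result is written to a plain CSV that Excel opens without complaint.

```python
import csv
import re
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collects the rows of the first <table> in a BtBb 'XLS' export,
    ignoring everything else (the <script> junk, the headers, etc.)."""

    def __init__(self):
        super().__init__()
        self.rows, self.row, self.cell = [], [], []
        self.in_table = self.in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "table":
            self.in_table = True
        elif self.in_table and tag == "tr":
            self.row = []
        elif self.in_table and tag in ("td", "th"):
            self.in_cell, self.cell = True, []

    def handle_endtag(self, tag):
        if tag in ("td", "th") and self.in_cell:
            # Collapse runs of whitespace; <p> tags inside cells just
            # disappear instead of turning into blank spreadsheet rows.
            self.row.append(re.sub(r"\s+", " ", "".join(self.cell)).strip())
            self.in_cell = False
        elif tag == "tr" and self.in_table:
            self.rows.append(self.row)
        elif tag == "table":
            self.in_table = False

    def handle_data(self, data):
        if self.in_cell:
            self.cell.append(data)

def btbb_export_to_csv(html_text, csv_path):
    """Parse the fake-XLS export and write its table out as real CSV."""
    parser = TableExtractor()
    parser.feed(html_text)
    with open(csv_path, "w", newline="", encoding="utf-8") as fh:
        csv.writer(fh).writerows(parser.rows)
    return parser.rows
```

Usage would be something like reading the downloaded file and calling `btbb_export_to_csv(open("known_issues.xls", encoding="utf-8").read(), "known_issues.csv")` — no extension renaming, no manual find/replace, no copy-paste through the browser.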









1Okay, it should be noted that while I'm writing this, I'm watching a live Phish concert (on the web, not in person), hence the oddball headers and occasionally goofy tone of the article).

All right, so you implemented the Open Photo Roster to show campus photo IDs to instructors.  This helps with learning student names, eases new faculty anxiety about teaching, and allows proctors to validate identities before an exam.  However, a question sometimes comes up: what about the pictures of freshmen who, now that they are seniors, don't look anything like their photos?


You are in luck.  We implemented image manipulation to age the freshmen photos and simulate how they will look in just 4 years.




Ok, maybe not exactly ... we're not there yet.  What we added are two additional rosters, which may help faculty see more current photos.  The two rosters are Blackboard Avatars and Gravatars.


Many schools allow students to upload avatars through the My Institution cloud interface.  These photos can be useful, and now you can see them in the photo roster.  The Gravatar system, on the other hand, is very popular in WordPress and other social media; it allows students to associate pictures with their school email address.  Instructors can specifically encourage the upload of photos to Bb Avatars or Gravatars to make sure their rosters are complete.
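For the curious, the way Gravatar resolves an email address to a picture is simple enough to sketch: the image lives at a URL derived from the MD5 hash of the trimmed, lowercased address.  The function below is my own illustration (the B2's internal lookup may differ); `d=identicon` just asks Gravatar for a generated placeholder when no photo exists.

```python
import hashlib

def gravatar_url(email, size=128):
    """Build a Gravatar image URL for an email address.

    Gravatar identifies accounts by the MD5 hex digest of the
    trimmed, lowercased address, so case and stray whitespace in
    the roster data don't matter.
    """
    digest = hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()
    return f"https://www.gravatar.com/avatar/{digest}?s={size}&d=identicon"

# Case and surrounding whitespace don't change the result.
print(gravatar_url("Student@Example.edu") == gravatar_url(" student@example.edu "))  # True
```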


The new B2 has two versions: Open Photo Roster and Open Photo Roster Plus.  Open Photo Roster displays Bb Avatars and Gravatars.  The Plus version requires custom setup arranged by  The B2s were tested on local installations, Managed Hosting, and SaaS.  They work with Original and Ultra courses (Ultra menu setup arranged by