SECTION 11: EVALUATING AND ASSESSING NEW TECHNOLOGY
Figure 11-1: Selection Matrix

The evaluation process that takes place prior to technology deployment is both rigorous and well defined. The same cannot be said for technology evaluation once the technology is deployed in the field. DIIT's vision for evaluating new technology includes a new process that allows DIIT to monitor the progress that educators at schools and libraries are making with technology after it is introduced, allowing DIIT to evaluate the utility of new technology and understand how it impacts student achievement.

Goal and Strategy to Obtain Vision
DIIT's goal is to incorporate user evaluation as an integral component of every new technology rollout. A requirement to include user evaluation would be part of each new RFP (Request for Proposal). The methodology for evaluation would vary from product to product and be developed jointly by vendor and NYCDOE staff.

Current State
Currently, our understanding of how effective technology is after it is deployed is largely anecdotal. DIIT staff often works with users, answering questions about technology, and in the course of this interaction gets a better understanding of how the technology is used at schools. However, there is no formal process to evaluate technology and address problems, and some technologies are rarely used after they have been deployed. An evaluative process applied as the technology is rolled out might allow the NYCDOE to address aspects of the technology that later prove problematic.

Target State
The target state is an environment in which every technology is periodically evaluated by users, and these evaluations are used to accelerate, improve, or curtail the deployment of the technology in the schools. It is in the DOE's best interest to ensure that technology that users value, and that improves student performance, is widely deployed.

Recommendations and Roadmap
Such a process starts with meetings between vendor staff, engineering staff at DIIT, and members of the user community to identify the desired goals. Understanding what users hope to gain from the new technology is a necessary step for deciding whether the introduction of such technology is successful. Once the criteria for success are understood, engineers and users should agree on a methodology for evaluation. This will vary by technology. In some cases, the methodology may require an evaluation form or questionnaire that school staff using the technology complete periodically. In other cases, it may require nothing more than monitoring the network to quantify variables such as bandwidth utilization.
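The plan deliberately leaves the monitoring methodology open. Purely as an illustration of the network-monitoring case, the following minimal Python sketch samples a Linux host's /proc/net/dev byte counters twice and reports average throughput over the interval. The interface name ("eth0"), the 60-second interval, and the script itself are illustrative assumptions, not a description of any NYCDOE tooling.

```python
import time

def read_iface_bytes(iface):
    """Return (rx_bytes, tx_bytes) for iface from /proc/net/dev (Linux)."""
    with open("/proc/net/dev") as f:
        for line in f:
            name, _, rest = line.partition(":")
            if name.strip() == iface:
                fields = rest.split()
                # Field 0 is received bytes; field 8 is transmitted bytes.
                return int(fields[0]), int(fields[8])
    raise ValueError(f"interface {iface!r} not found")

def utilization_mbps(iface, interval_s=60):
    """Sample byte counters twice; return average (rx, tx) Mbps over the interval."""
    rx1, tx1 = read_iface_bytes(iface)
    time.sleep(interval_s)
    rx2, tx2 = read_iface_bytes(iface)
    to_mbps = lambda delta: delta * 8 / interval_s / 1_000_000
    return to_mbps(rx2 - rx1), to_mbps(tx2 - tx1)

if __name__ == "__main__":
    rx, tx = utilization_mbps("eth0", interval_s=60)  # "eth0" is illustrative
    print(f"rx: {rx:.2f} Mbps, tx: {tx:.2f} Mbps")
```

In a district-wide deployment, collection would more plausibly run through SNMP polling of switch interfaces than per-host scripts, but the arithmetic is the same: the delta of byte counters over a known interval, converted to bits per second.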
The periodic review of these results by DIIT staff, along with follow-up meetings between users and engineers, can allow technology deployment to follow a number of different paths, each of which is valuable. One is to accelerate deployment of technology that quickly proves to be very valuable, making the technology available to a larger base of users. A second is to work with the vendor to incorporate new features that users realize would enhance the utility of the technology. Another is to identify problems or limitations of the technology that went undiscovered during initial testing. The technology environment found in NYCDOE schools is so diverse that technology that works well in one school will often perform poorly in another; sometimes, only experience with the technology in diverse school settings can reveal what these problems are. Finally, one possible outcome is to realize that the technology is not accomplishing what it was intended to do and that plans for deployment should be curtailed. This will be crucial information for the NYCDOE. This quality assurance process, completed at regular intervals after technology deployment, should be an integral component of all of the NYCDOE's technology plans. Only by being open to all possible outcomes will new technology be deployed in an optimal way at the NYCDOE.

Benefits and Impact if Not Implemented
Incorporating user evaluations in the technology deployment process will ensure that the NYCDOE is deploying technology and applications that best meet user needs. It allows the NYCDOE to expand the role of the most useful technologies and curtail the deployment of technologies and applications that are not embraced by the user community. Only by canvassing users to understand how they use technology can the NYCDOE be sure that its investment in technology is well spent. If this is not done, the NYCDOE risks investing in technologies that are inefficient and not widely used.

Budget to Implement
The budget for user technology evaluations will be incorporated into vendor pricing as part of the RFP process.
3-Year Plan (all figures in $ millions)

Section | Year 1 | Year 2 | Year 3 | Total
Section 4: Upgrading school infrastructure models for 300 schools | 27.00 | 27.00 | 27.00 | 81.00
Section 5: Data Center Services/Unified Storage | 78.00 | 42.00 | 30.00 | 150.00
Section 6: Deploy Unified Communications/Collaboration solutions to 300 schools | 14.25 | 7.25 | 7.25 | 28.75
Section 6: Deploy Next Generation Wireless | 8.00 | 8.00 | 8.00 | 24.00
Section 7: Deploy user devices to 600 schools | 76.00 | 69.00 | 69.00 | 214.00
Section 8: Information Security and Identity | 1.54 | 1.54 | 1.54 | 4.62
Section 9: Learning Management Systems | 2.40 | 2.40 | 2.40 | 7.20
Section 10: Enhancements to Network Operations Center (NOC) | 4.50 | 6.25 | 5.00 | 15.75
Total | 211.69 | 163.44 | 150.19 | 525.32
These are summary results; detailed breakdowns of these figures appear in the individual sections.
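As a quick sanity check on the roll-up arithmetic, the following Python sketch re-derives each initiative's three-year total and the per-year column totals; the data literals simply transcribe the published figures above.

```python
# Per-year budget figures transcribed from the table above ($ millions).
budget = {
    "Section 4: Upgrading school infrastructure models for 300 schools": (27.00, 27.00, 27.00),
    "Section 5: Data Center Services/Unified Storage": (78.00, 42.00, 30.00),
    "Section 6: Deploy Unified Communications/Collaboration solutions": (14.25, 7.25, 7.25),
    "Section 6: Deploy Next Generation Wireless": (8.00, 8.00, 8.00),
    "Section 7: Deploy user devices to 600 schools": (76.00, 69.00, 69.00),
    "Section 8: Information Security and Identity": (1.54, 1.54, 1.54),
    "Section 9: Learning Management Systems": (2.40, 2.40, 2.40),
    "Section 10: Enhancements to Network Operations Center (NOC)": (4.50, 6.25, 5.00),
}

# Row totals: each initiative's 3-year cost.
for name, years in budget.items():
    print(f"{name}: {sum(years):.2f}")

# Column totals: spend per year, then the grand total.
year_totals = [round(sum(years[i] for years in budget.values()), 2) for i in range(3)]
print("Per-year totals:", year_totals)             # [211.69, 163.44, 150.19]
print("Grand total:", round(sum(year_totals), 2))  # 525.32
```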