
Monday, April 22, 2013

SharePoint Solution Test Plan



Each solution should have its own defined test plan. What follows is an example of a SharePoint test plan. This plan is very basic and is only meant to introduce ideas.

Objective

To test the new WebPart XYZ for [Organization Name] based on the Requirements Traceability Matrix provided prior to the inception of this portal project. In this Test Plan, [Organization Name] expects to achieve the following:
1. Ensure that all business requirements have been met.
2. Ensure that all functional requirements have been met.
3. Ensure that branding/graphical elements render correctly in our chosen browsers.
4. Evaluate system performance.
5. Determine the level of user satisfaction.
The above objectives will be accomplished by using the use cases outlined in subsequent sections of this document.

Schedules

The following schedule outlines testing roles, dates, and timeframes. Each group is responsible for adhering to this schedule, and each segment must be completed within its scheduled timeframe.
User Role        | Dates                            | Timeframe
Administrators   | Sunday, May 2 – Wednesday, May 5 | 5pm – 5am
Content Stewards | Monday, May 3 – Friday, May 7    | 7am – 5pm
End Users        | Monday, May 10 – Friday, May 14  | 7am – 5pm

Responsibilities

Administrators:     Site owners who manage the site's security, content, design, maintenance, and users; they also use and navigate the site.
Content Stewards:   People who administer sites at the departmental level; they also use and navigate the site.
Users:              People who read, navigate, and/or contribute content to the portal.

Test Cases

What follows is a matrix of the test cases that will be performed for this solution.

User Role: Administrators
Test Description (Intention): Minimize and restore the Web Part on a page.
How to Test:
  1. From a page, in the ribbon, click the Page tab, and then click the Edit command.
  2. On the page, point to the Web Part, click the down arrow, and then click Minimize.
  3. When you have finished editing the page, click the Page tab, and then click Save & Close.
  4. From that same page, in the ribbon, click the Page tab, and then click the Edit command.
  5. On the page, point to the Web Part, click the down arrow, and then click Restore.
  6. When you have finished editing the page, click the Page tab, and then click Save & Close.
Result (System Response): Tester documents results here.

User Role: Content Stewards
Test Description (Intention): Add a Web Part to a page.
How to Test:
  1. From a page, in the ribbon, click the Page tab, and then click the Edit command.
  2. Click on the page where you want to add a Web Part, click the Insert tab, and then click Web Part.
  3. Under Categories, select a category, such as Lists and Libraries, select the Web Part that you want to add to the page, such as Announcements, and then click Add.
  4. When you select a Web Part, information about the Web Part is displayed in About the Web Part.
  5. When you have finished editing the page, click the Page tab, and then click Save & Close.
Result (System Response): Tester documents results here.

User Role: End Users
Test Description (Intention): View the Web Part on a page.
How to Test:
  1. Navigate to the Web Part page.
Result (System Response): Tester documents results here.
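The End Users case above can also be spot-checked with a lightweight browser script. What follows is a minimal sketch using Selenium in Python; the page URL and the Web Part title are hypothetical placeholders for your environment, and the browser session is assumed to already be authenticated.

    # Minimal browser check for the "View Web Part on page" case.
    # Assumptions: Selenium and a Chrome driver are installed, PAGE_URL and
    # WEB_PART_TITLE are placeholders for your environment, and the browser
    # session is already authenticated (e.g., via a signed-in profile).
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    PAGE_URL = "https://portal.example.com/SitePages/WebPartTest.aspx"  # hypothetical
    WEB_PART_TITLE = "XYZ"  # hypothetical Web Part title

    driver = webdriver.Chrome()
    try:
        driver.get(PAGE_URL)
        # Web Part titles render as visible text; a containment check documents
        # whether the part shows up for an end user at all.
        body_text = driver.find_element(By.TAG_NAME, "body").text
        found = WEB_PART_TITLE in body_text
        print(("PASS" if found else "FAIL") + ": Web Part title "
              + ("found" if found else "not found") + " on the page")
    finally:
        driver.quit()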

Task Checklist for Testing Web Parts

The following sample SharePoint-specific checklist contains a series of tasks designed to help you determine the quality of Web Parts you are asked to deploy or maintain. It can be used in addition to the test cases above, or as a simplified alternative, depending on your organizational needs.

Task Checklist

Task

Verify that you can add the Web Part properly to a Web Part zone.
Verify that static Web Parts render appropriately and do not cause the Web Part Page to fail.
Verify that the Web Part works correctly regardless of where the Web Part Page is located.
Verify that every public property can handle bad input.
Verify that the Web Part handles all of its exceptions.
Verify that the Web Part renders correctly in SharePoint Designer.
Verify that Web Part properties displayed in the tool pane are user-friendly.
Verify that the Web Part appears appropriately in the web part gallery.
Verify that the Web Part previews properly (through the web part gallery).
Verify that you can import and export the Web Part properly.
Verify that Web Part properties are not dependent on each other.
Verify that Web Parts work correctly with different combinations of Web Part zone settings.
Verify that the Web Part renders appropriately based on user permissions.
Verify that adding several instances of the same Web Part to a Web Part Page (or in the same Web Part zone) works correctly.
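Some of these checks lend themselves to light automation. For example, import/export verification can start with a sanity check on the exported definition file. The sketch below is a minimal example, assuming the standard v3 .webpart export format; the file name is a hypothetical placeholder.

    # Quick sanity check on an exported Web Part definition (.webpart, v3 schema).
    # Assumption: XYZ.webpart is a placeholder for your exported definition file.
    import xml.etree.ElementTree as ET

    NS = {"wp": "http://schemas.microsoft.com/WebPart/v3"}
    WEBPART_FILE = "XYZ.webpart"  # hypothetical exported definition

    root = ET.parse(WEBPART_FILE).getroot()

    # The type element carries the fully qualified class and assembly name.
    type_elem = root.find(".//wp:metaData/wp:type", NS)
    print("Type name:", type_elem.get("name") if type_elem is not None else "MISSING")

    # Walk the declared properties and confirm a Title is set.
    title = None
    for prop in root.findall(".//wp:data/wp:properties/wp:property", NS):
        if prop.get("name") == "Title":
            title = (prop.text or "").strip()
    print("Title property:", title or "MISSING")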

 

You Are Not Done Yet - checklist



Pick something. Anything. A feature in your favorite software application, your favorite toy, your favorite piece of furniture. Now start brainstorming things you could do to test it. Think of as many different things to do to that object as you can. Come back and continue reading when you’re done.

 

What’s that? You’re back already? There are test cases you haven’t thought of, I guarantee it. How do I know? Because for even the tiniest bit of something (the Find dialog box in your web browser, say) there are billions of possible test cases. Some of them are likely to find interesting issues and some of them aren’t. Some of them we execute because we want to confirm that certain functionality works correctly. Those confirmation cases are the basis of my You Are Not Done Yet list.

 

This list is large and can be overwhelming at first. Fear not. You have probably already covered many of these cases. Others won’t be applicable to your situation. Some may be applicable, yet you will decide to pass on them for one reason or another. Verifying you have executed each of these test cases is not the point of the list. The point is to get you thinking about all of the testing you have and have not done, and to point out areas you meant to cover but haven’t yet.

 

So don’t quail at the thought of all this testing you haven’t done yet. Instead, customize this list to your context. Scratch off items which do not apply. Use the list as a launch point for finding items not on it which do apply. Use it to organize your testing before you start. Use it as a last-minute checklist before you finish. How you use it is not nearly as important as that you use it in the first place.

 

Input Methods

You are not done testing yet unless...you have tested the following input methods:

Keyboard. Duh, right? But it's important to remember that testing keyboard input doesn't just mean verifying you can type into text boxes. Scour your application for every control that accepts keyboard input - not just text values, but also shortcut key sequences and navigation. (Yes, there's some overlap here with Dialog Box Navigation and Accessibility.) If your application uses any custom controls, pay them especial attention as they are likely to use custom keystroke processing. Make those mouse-hating keyboard wizards happy!
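Keyboard coverage is easy to spot-check with a tiny automation script. Here is a minimal sketch, assuming the pyautogui package and an application window that you bring to the foreground yourself; it types a value, tabs to the next control, and fires a shortcut key sequence so you can watch how each control reacts.

    # Minimal keyboard-input probe. Assumptions: pyautogui is installed, the
    # application under test is in the foreground, and a text control has focus.
    import time

    import pyautogui

    time.sleep(3)  # a moment to click into the application window

    pyautogui.typewrite("plain text value", interval=0.02)  # ordinary typing
    pyautogui.press("tab")                                  # keyboard navigation
    pyautogui.typewrite("second control", interval=0.02)
    pyautogui.hotkey("ctrl", "s")                           # shortcut key sequence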

 

Mouse. Duh again, but again it's so obvious that it's easy to miss. And again, pay especial attention to custom controls as they are likely to do custom mouse handling.

 

Pen input. Depending on your target platform(s), this could mean pen input direct to your application, filtered through the operating system (e.g., the Tablet Input Panel on Microsoft Windows), and/or filtered through third-party input panels. Each input source has its own quirks that just might collide with your application's own quirks.

 

Speech input. Depending on your target platform(s), this could mean speech input direct to your application, filtered through the operating system, and/or filtered through third-party speech processors.

 

Foreign language input. On Microsoft Windows this usually means an Input Method Editor (IME), either the one that comes with the operating system or one provided by a third party. These can be troublesome even for applications that do not do any custom keystroke processing. For example, a Japanese-language input processor likely traps all keystrokes, combines multiple keystrokes into a single Japanese character, and then sends that single character on to the application. Shortcut key sequences should bypass this extra layer of processing, but oftentimes they don't. (Note: turning off the IME is one solution to this quandary, but it is almost never the right answer!)

 

Assistive input devices such as puff tubes. The operating system generally abstracts these into a standard keyboard or mouse, but they may introduce unusual conditions your application needs to handle, such as extra-long waits between keystrokes.

 

Random other input sources. For example, I have seen games where you control the action by placing one or more sensors on your finger(s) and then thinking what you want the program to do. Some of these devices simply show up as a joystick or mouse. What happens if someone tries to use such a device in your application?

 

Multiple keyboards and/or mice. Microsoft Windows supports multiple mice and keyboards simultaneously. You only ever get a single insertion point and mouse pointer, so you don't have to figure out how to handle multiple input streams. You may, however, need to deal with large jumps in, for example, mouse coordinates. Oh the testing fun!

 

Files

You are not done testing unless...you have looked at each and every file that makes up your application, for they are chock full of information which is often ignored. And we all know what happens when things are ignored - bugs appear! I remember one bug bash where a developer chalked up over fifty bugs simply by going through this list!

 

 Verify the version number of each file is correct.

 Verify the assembly version number of each managed assembly is correct. Generally the assembly version number and the file version number should match. They are specified via different mechanisms, however, and must explicitly be kept in sync.

 Verify the copyright information for each file is correct.

 Verify each file is digitally signed - or not, as appropriate. Verify its digital signature is correct.

 Verify each file is installed to the correct location. (Also see the Setup YANDY.)

 Verify you know the dependencies of each file. Verify each dependency is either installed by your setup or guaranteed to be on the machine.

 Check what happens when each file - and each of its dependencies - is missing.

 Check each file for recognizable words and phrases. Determine whether each word or phrase you find is something you are comfortable with your customers seeing.
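Several of these file checks can be scripted. Below is a minimal sketch that works from a hypothetical install manifest: it confirms each file exists at its expected location and dumps runs of readable ASCII so you can review the words and phrases customers might see. Version, copyright, and digital-signature checks would need platform-specific tooling on top of this.

    # Walk a hypothetical install manifest: confirm each file landed where it
    # should, and extract readable strings for review. The manifest entries are
    # placeholders; version/copyright/signature checks need platform tooling.
    import os
    import re

    MANIFEST = {
        "XYZ.WebPart.dll": r"C:\Program Files\XYZ",          # hypothetical
        "XYZ.WebPart.dll.config": r"C:\Program Files\XYZ",   # hypothetical
    }

    PRINTABLE = re.compile(rb"[ -~]{6,}")  # runs of six or more printable ASCII chars

    for name, folder in MANIFEST.items():
        path = os.path.join(folder, name)
        if not os.path.isfile(path):
            print(f"MISSING: {path}")
            continue
        print(f"FOUND:   {path}")
        with open(path, "rb") as handle:
            data = handle.read()
        # Review these the way a customer would read them.
        for match in PRINTABLE.findall(data)[:20]:
            print("    ", match.decode("ascii"))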

 

Filenames

 

You are not done testing yet unless...you have tested the following test cases for filenames:

 Single character filenames

 Short filenames

 Long filenames

 Extra-long filenames

 Filenames using text test cases

 Filenames containing reserved words

 Just the filename (file.ext)

 The complete path to the file (c:\My\Directory\Structure\file.ext)

 A relative path into a subfolder (Sub\Folder\file.ext)

 A relative path into the current folder (.\file.ext)

 A relative path into a parent folder (..\Parent\file.ext)

 A deeply nested path (Some\Very\Very\Very\Very\Very\Deeply\Nested\File\That\You\Will\Never\Find\Again\file.ext)

 UNC network paths (\\server\share\Parent\file.ext)

 Mapped drive network paths (Z:\Parent\file.ext)

 

Filenames are interesting creatures and a common source of bugs. Microsoft Windows applications that don't guard against reserved words set themselves up for a Denial of Service attack. Applications on any operating system that allow any old file to be opened/saved/modified leave a gaping hole into "secured" files. Some users stuff every document they've ever created into their user folder. Other users create a unique folder for each document. Certain characters are allowed in filenames that aren't allowed elsewhere, and vice versa. Spending some focused time in this area will be well worth your while.
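One way to work through the list above is to generate the path variations programmatically and feed each one to whatever open/save entry point you are testing. The sketch below only builds the cases and exercises plain file creation as a stand-in; the UNC and mapped-drive entries are hypothetical placeholders for shares in your own lab.

    # Generate filename variations from the checklist above and exercise a
    # stand-in operation (plain file creation). Swap in your application's own
    # open/save entry point. The UNC and mapped-drive paths are placeholders.
    import os
    import tempfile

    base = tempfile.mkdtemp()

    cases = {
        "single character": "a.txt",
        "short name": "notes.txt",
        "long name": ("x" * 100) + ".txt",
        "extra-long name": ("x" * 300) + ".txt",
        "relative subfolder": os.path.join("Sub", "Folder", "file.txt"),
        "current folder": os.path.join(".", "file.txt"),
        "deeply nested": os.path.join(*(["Very"] * 12 + ["file.txt"])),
        "UNC path": r"\\server\share\Parent\file.txt",   # placeholder
        "mapped drive": r"Z:\Parent\file.txt",           # placeholder
    }

    for label, candidate in cases.items():
        absolute = os.path.isabs(candidate) or candidate.startswith("\\\\")
        path = candidate if absolute else os.path.join(base, candidate)
        try:
            os.makedirs(os.path.dirname(path), exist_ok=True)
            with open(path, "w") as handle:
                handle.write("test")
            print(f"OK    {label}: {path}")
        except OSError as error:
            print(f"ERROR {label}: {error}")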

Filename Invalid Characters and Error Cases

You are not done testing yet unless...you have checked for invalid characters in filenames, and for reserved filenames. Operating systems tend to get grumpy if you try to use wildcards (e.g., '*') in filenames. They may also treat certain filenames specially. For example, Microsoft Windows provides a single API for creating/opening files, communication ports, and various other cross-process communication mechanisms. Well-known communication ports (e.g., COM1) are addressed by "filename" just as though they were a file - kinda handy, but it means that you can't use "COM1" for a physical file on disk.

Testing for this is easy: brainstorm a list of interesting test cases, then slap each one into each of your application's dialog boxes, command line arguments, and APIs that take a filename. Illegal characters will probably throw an error, but trying to open a reserved filename is likely to hang your app.

See the MSDN topic "Naming a file" [http://msdn.microsoft.com/library/default.asp?url=/library/en-us/fileio/fs/naming_a_file.asp] for the full skinny on reserved characters and filenames on Microsoft operating systems.
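The reserved names and illegal characters from that topic are easy to enumerate, so the negative cases can be generated rather than typed by hand. In the sketch below, check_open is a hypothetical stand-in for your application's dialog box, command line, or API; each generated name should produce a clear error rather than a crash or a hang.

    # Enumerate Windows-reserved device names and illegal filename characters
    # (per the MSDN "Naming a File" topic) as negative test cases. check_open()
    # is a hypothetical stand-in for the code path under test.
    ILLEGAL_CHARS = '<>:"/\\|?*'
    RESERVED = (["CON", "PRN", "AUX", "NUL"]
                + [f"COM{n}" for n in range(1, 10)]
                + [f"LPT{n}" for n in range(1, 10)])

    negative_cases = [f"bad{ch}name.txt" for ch in ILLEGAL_CHARS]
    negative_cases += RESERVED                               # bare device names
    negative_cases += [f"{name}.txt" for name in RESERVED]   # device name plus extension

    def check_open(filename: str) -> None:
        """Hypothetical: call your app's open/save path and assert it reports a
        clear error instead of crashing or hanging."""
        print("would test:", filename)

    for case in negative_cases:
        check_open(case)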

 


File Operations

You are not done testing unless...you have thoroughly tested your application's Open, Save, and Save As functionality. I don't know about you, but I get grumpy when my work disappears into thin air! For many applications, if data cannot be saved and later regurgitated with full fidelity, the application may as well not exist. Thus it is important to verify that the correct thing happens under the following conditions (a round-trip sketch follows at the end of this list):

Open each supported file type and version and Save As each supported file type and version. Especially important is to open from and save as the previous version of your native format. Customers tend to get grumpy if upgrading to a new version means they can no longer open old documents! And they tend to not upgrade if they do not have a simple way to share documents created in the new version of your application with those poor souls still languishing on the old version.

 

 Open each supported file type and version and Save. If the file type and version can be selected during a Save operation (as opposed to a Save As operation), Save to each supported file type and version. More usually, Save saves to the current version only.

 

 Roundtrip from each supported version to the current version and back to the previous version. Open the resulting file in that version of your application. Does it open correctly? Are new features correctly converted to something the previous version understands? How are embedded objects of previous versions handled?

 

 Open files saved in the current version of your application in previous versions of your application. If the document opens, how are features added in the new version handled? If the document does not open, is the resulting error message clear and understandable?

 

 Open from and Save and Save As to different file systems (e.g., FAT and NTFS) and protocols (e.g., local disk, UNC network share, http://). The operating system generally hides any differences between types of file systems; your application probably has different code paths for different protocols however.

 

 Open, Save, and Save As via the following mechanisms (as appropriate):

o Menu item

o Toolbar item

o Hot key (e.g. Control+S for Save)

o Most Recently Used list

o Microsoft SharePoint document library

o Context menu(s)

o The application’s Most Recently Used list

o The operating system’s Most Recently Used list

o Drag-and-drop from the file system explorer

o Drag-and-drop from your desktop

o Drag-and-drop from another application

o Command line

o Double-click a shortcut on your desktop

o Double-click a shortcut in an email or other document

o Embedded object

 

  Open from and Save and Save As to the following locations:

o Writable files

o Read-only files

o Files to which you do not have access (e.g., files whose security is set such that you cannot access them)

o Writable folders

o Read-only folders

o Folders to which you do not have access

o Floppy drive

o Hard drive

o Removable drive

o USB drive

o CD-ROM

o CD-RW

o DVD-ROM

o DVD-RW

 

 Open from and Save and Save As to various types and speeds of network connections. Dial-up and even broadband have different characteristics than that blazing-fast hundred-gigabit network your office provides!

 Open files created on (and Save and Save As to as appropriate):

o A different operating system

o An OS using a different system locale

o An OS using a different user locale

o A different language version of your application

 Open from and Save and Save As to filenames containing

o The Text Entry Field YANDY list, as appropriate

o The Filenames YANDY list, as appropriate

o The Invalid Filenames YANDY list

o Spaces

 

 Cause the following to occur during Open, Save, and Save As operations:

o Drop all network connections

o Fail over to a different network connection

o Restart the application

o Reboot the machine

o Sleep the machine

o Hibernate the machine

 

 Put AutoSave through its paces. What happens when you AutoSave every zero minutes? Every minute? With a very big document? If the AutoSave timer is per document, what happens when multiple AutoSaves kick off simultaneously, or while another AutoSave is in progress? Does file recovery from AutoSave work as you expect? What happens if the application crashes during an AutoSave? During recovery of an AutoSaved document?

 

 Save and Save As in the following conditions:

o No documents are dirty

o One document is dirty

o Multiple documents are dirty and the user chooses to save all of them

o Multiple documents are dirty and the user chooses to save none of them

o Multiple documents are dirty and the user chooses to save only some of them
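To make the round-trip and fidelity cases above concrete, here is a minimal sketch of a save/open fidelity check. The open_document and save_document functions, the format list, and the sample content are hypothetical stand-ins for your application's real file API; what matters is the pattern: save to every supported format, reopen, and compare against the original.

    # Round-trip fidelity check: save sample content in each supported format,
    # reopen it, and compare with the original. open_document(), save_document(),
    # SUPPORTED_FORMATS, and the sample content are hypothetical stand-ins for
    # your application's real file API and formats.
    import os
    import tempfile

    SUPPORTED_FORMATS = ["native-v2", "native-v1", "plain-text"]  # hypothetical
    SAMPLE_CONTENT = {"title": "Round-trip test", "body": "Representative content."}

    def save_document(content, path, fmt):
        """Hypothetical stand-in for the application's Save As."""
        with open(path, "w", encoding="utf-8") as handle:
            handle.write(f"{fmt}|{content['title']}|{content['body']}")

    def open_document(path):
        """Hypothetical stand-in for the application's Open."""
        with open(path, encoding="utf-8") as handle:
            fmt, title, body = handle.read().split("|", 2)
        return {"title": title, "body": body}

    workdir = tempfile.mkdtemp()
    for fmt in SUPPORTED_FORMATS:
        path = os.path.join(workdir, f"roundtrip.{fmt}")
        save_document(SAMPLE_CONTENT, path, fmt)
        reopened = open_document(path)
        status = "OK  " if reopened == SAMPLE_CONTENT else "LOSS"
        print(f"{status} {fmt}: {path}")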