SURVEY: Silk Central Reporting
Dear users of Silk Central!
One of the next releases will have a dedicated focus on enhancing Silk Central’s reporting.
For that reason I ask you to provide feedback on:
- Your reporting focus – what are the important questions Silk Central needs to answer for you (and maybe does not answer today)?
- Your usage of shipped reports – which ones do you use frequently?
Thank you very much in advance!
I'll get back with more input shortly, but I wanted to jot down one thing that immediately comes to mind.
Even though many of the canned reports are useful (I haven't tried all of them yet), we often do ad-hoc analysis of the data. The tool we develop and sell does this very well, and we try to use it as often as we can to "sip our own champagne".
Hence, I would like either better documentation on how to process SCTM data in external reporting/analysis tools, or maybe even a revised or new database schema geared towards analysis in external tools.
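To make the intent concrete, here is a minimal sketch of the kind of ad-hoc analysis we mean. The `test_runs` table and its columns are invented purely for illustration; the real SCTM schema differs, which is exactly why better documentation or an analysis-friendly schema would help:

```python
import sqlite3

# Hypothetical, analysis-friendly table -- the names here are invented
# for illustration and do NOT match the real SCTM schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE test_runs (test_name TEXT, container TEXT, status TEXT)"
)
conn.executemany(
    "INSERT INTO test_runs VALUES (?, ?, ?)",
    [
        ("login_ok", "Smoke", "Passed"),
        ("login_bad_pw", "Smoke", "Failed"),
        ("export_pdf", "Reporting", "Passed"),
        ("export_csv", "Reporting", "Passed"),
    ],
)

# A typical ad-hoc question: pass rate per test container.
for container, rate in conn.execute(
    """
    SELECT container,
           AVG(CASE WHEN status = 'Passed' THEN 1.0 ELSE 0 END)
    FROM test_runs
    GROUP BY container
    ORDER BY container
    """
):
    print(f"{container}: {rate:.0%}")
```

With a documented, stable schema, this kind of query could be run from any external analysis tool instead of being limited to the canned reports.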
Some more thoughts:
- A connection with project management tools (e.g. MS Project), either for export or as a live connection
- Elaborate more on the test planning capabilities and relate this functionality to planning in other parts of a software project
- A simpler way of customizing the reports - e.g. by templates, easy editing, isolation from the underlying SQL, etc.
- Better support for customizing the dashboard - more gadgets and extension points to create custom gadgets, dashboards and mashups with other systems
- Integration with other systems:
  - JIRA/Confluence gadgets (issue tracker system integrations)
  - JIRA (with support for updating estimates and logged work of connected issues)
  - CI system integrations (e.g. Jenkins) for displaying test status in Jenkins
The following may not relate directly to reporting, but it is close:
- Exporting tests and execution plans to Excel, Word, PowerPoint, etc.
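As a stop-gap we assemble such exports by hand today. A small sketch of what a built-in export could produce, using only the Python standard library (the test data below is invented; in reality it would come from SCTM):

```python
import csv
import io

# Invented sample data -- in reality this would be read from SCTM.
tests = [
    {"id": 1, "name": "login_ok", "plan": "Smoke", "status": "Passed"},
    {"id": 2, "name": "export_pdf", "plan": "Reporting", "status": "Failed"},
]

# CSV is the lowest common denominator: Excel opens it directly, and
# tables in Word/PowerPoint can be pasted from it.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name", "plan", "status"])
writer.writeheader()
writer.writerows(tests)
print(buf.getvalue())
```

A native export would of course go further (formatting, attachments, execution history), but even a plain tabular dump like this would cover many day-to-day needs.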
Currently we are using the following reports:
• Planned vs. actual execution time for manual tests, accumulated per test group (self-made).
• An executive summary report of test results consisting of the following parts (self-made integration):
  o Status of requirements with priority XXX (modified standard report).
  o Distribution of test types (manual, unit, …) (self-made).
  o Table of requirements that are not covered or not executed (self-made).
  o Table of unresolved issue status and planning details (self-made).
• A document report of all requirements (modified "all requirements" report).
All these reports are generated with project scope, but we are working on a solution for cross-project report generation scoped by one or more execution definitions.
Furthermore, we need a report on requirement/test status in the context of particular execution plan parameters (each parameter value representing a configuration option).
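To illustrate what such a report would show, here is a small sketch that groups test status by one execution plan parameter (a hypothetical `browser` parameter; all data is invented):

```python
from collections import Counter

# Invented sample results: (execution plan parameter value, test status).
results = [
    ("Firefox", "Passed"),
    ("Firefox", "Failed"),
    ("Chrome", "Passed"),
    ("Chrome", "Passed"),
]

# Status counts per configuration option, i.e. per parameter value.
status_by_config = {}
for browser, status in results:
    status_by_config.setdefault(browser, Counter())[status] += 1

for browser in sorted(status_by_config):
    print(browser, dict(status_by_config[browser]))
```

The report we have in mind would present exactly this kind of per-configuration breakdown, but fed from the real execution plan parameters in SCTM.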
SCTM’s report facility is very flexible, and users are able to develop highly customized reports. What is really annoying, though, is the poor usability:
• Currently it is impossible for a user to recognize that a report task has died (e.g. due to a memory overflow). Please provide better feedback on report generation progress.
• Please provide a way to interrupt a running report generation.
• There is a need for better diagnosis of chart server problems. Currently it seems impossible (at least for me) to determine which report caused a given entry in the chart server log.
• Please add the possibility for SCTM administrators to define (and change) default parameters for reports, and for SCTM users to easily reset report parameters to their defaults.
• Currently, one has to maintain the same report in each and every project. Please enhance global report administration.
• We would appreciate the ability to easily assemble a report out of several (sub-)reports and charts.
• An inherent versioning scheme for report templates would be great.
With best regards,