MicroStrategy ONE

Product Development Methodology

The Technology team at MicroStrategy presently follows an agile product development methodology. The organization consists of the following groups: Scrum Teams, Product Owners, Product Management, Software Engineering, Software Quality, User Experience Design, Content, Localization, and DevOps.

The scrum teams are closely aligned by their deliverables, working in two-week iteration/sprint cycles that culminate in key ceremonies such as demonstrations of the increment of software delivered in a particular iteration. The “Execution Plan” is defined as the six iterations comprising the engineering and quality deliverables for the calendar quarter. The development of MicroStrategy’s products encompasses software architecture and design principles, as well as continuous validation and deployment of the software. During product development, portfolio features (“Features”) are decomposed into their constituent stories. Incremental values associated with these Features are assigned, and resources are then tasked accordingly.

MicroStrategy has central source control, versioning, branching, and code management systems across its worldwide Technology centers. These systems provide governance, check-in controls, and code review enforcement, as well as actionable insight into development processes, code coverage, static code analysis, and other fundamental tenets of enterprise software delivery.
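Purely as an illustration of the kind of actionable insight such systems can surface, the short Python sketch below averages per-commit code coverage by component; the record shape, component names, and figures are hypothetical and are not drawn from MicroStrategy's tooling.

```python
from collections import defaultdict
from statistics import mean

def coverage_by_component(records: list[dict]) -> dict[str, float]:
    """Average the line coverage reported for commits, grouped by component."""
    grouped: dict[str, list[float]] = defaultdict(list)
    for record in records:
        grouped[record["component"]].append(record["coverage"])
    return {component: round(mean(values), 1) for component, values in grouped.items()}

# Hypothetical per-commit records, as a reporting export might provide them.
commits = [
    {"component": "server", "coverage": 84.0},
    {"component": "server", "coverage": 79.5},
    {"component": "web",    "coverage": 91.2},
]

print(coverage_by_component(commits))   # {'server': 81.8, 'web': 91.2}
```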

MicroStrategy maintains a centralized system for capturing all aspects of product development and quality management. Quality management includes “QA Plans,” which are pre-defined automated and manual test cases that represent an array of user workflows. The centralized system captures all work items, from Initiatives and Features down through user stories, tasks, and defects. Initiatives and Features are measured in points, while tasks are measured in hours.
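As a rough sketch of how such a hierarchy of work items could be modeled, the following Python dataclasses capture the Initiative-to-defect decomposition and the points-versus-hours sizing described above; the type and field names are illustrative assumptions, not the schema of MicroStrategy's system.

```python
from __future__ import annotations

from dataclasses import dataclass, field

@dataclass
class Task:
    title: str
    estimate_hours: float           # tasks are sized in hours

@dataclass
class Defect:
    title: str
    severity: str

@dataclass
class Story:
    title: str
    tasks: list[Task] = field(default_factory=list)
    defects: list[Defect] = field(default_factory=list)

@dataclass
class Feature:
    title: str
    points: int                     # Features are sized in points
    stories: list[Story] = field(default_factory=list)

@dataclass
class Initiative:
    title: str
    points: int                     # Initiatives are also sized in points
    features: list[Feature] = field(default_factory=list)
```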

The central system is integrated with the source code repository, build process, and analytics to provide transparency into what the teams are working on, the status of those work items, and the size and nature of the source code commits. Moreover, it helps ensure traceability of requirements through software design, code, automation, and integration testing. All of this allows for continuous integration, continuous delivery, and continuous deployment. This central system is used by Product Management, Product Owners, Architects, Engineers, DevOps, Quality Analysts, User Experience, Content, Documentation, and Executive roles.
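To make the traceability idea concrete, here is a minimal sketch that assumes one common convention: commit messages reference a work item key (for example, a hypothetical `WI-1234`), which an integration can use to link source commits back to requirements. The key format, pattern, and function are illustrative only.

```python
import re

# Assumed convention: commit messages reference a work item key such as "WI-1234".
WORK_ITEM_PATTERN = re.compile(r"\b(WI-\d+)\b")

def linked_work_items(commit_messages: list[str]) -> dict[str, list[str]]:
    """Group commit messages by the work item key they reference.

    Messages that reference no work item are collected under "UNLINKED",
    which a traceability report could then flag for follow-up.
    """
    links: dict[str, list[str]] = {}
    for message in commit_messages:
        match = WORK_ITEM_PATTERN.search(message)
        key = match.group(1) if match else "UNLINKED"
        links.setdefault(key, []).append(message)
    return links

if __name__ == "__main__":
    print(linked_work_items([
        "WI-1234 Fix report export timeout",
        "Refactor cache layer",     # no key, so it would be flagged as unlinked
    ]))
```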

Release Methodology

MicroStrategy product releases include monthly releases for mobile and desktop applications. These releases comprise new functionality and product improvements based on innovation and customer feedback. Additionally, every quarter, MicroStrategy releases cumulative Updates on top of the latest platform release. As an example, the release schedule in 2022 includes Update 5 in Q1 (11.3.5), Update 6 in Q2, Update 7 in Q3, and Update 8 in Q4. While MicroStrategy platform releases, previously on an annual cadence, customarily deliver significant new capabilities and innovations to the market, as well as incremental product improvements, MicroStrategy Updates focus on a select number of customer-reported items and on security, performance, and scalability improvements. With improvements to velocity and development practices, MicroStrategy is able to provide the same type of scope in an Update on a quarterly basis, albeit with smaller deltas between releases. These Updates also continue to incorporate the software quality metrics captured in the system of record for customer issues, including priority, severity, impact, and pervasiveness. A typical Update cycle for MicroStrategy consists of six iterations in which aspects of the platform are validated across environments, persona-driven use cases, functionality, performance, stability, security, scalability, internationalization, compatibility, upgrade, accessibility, and other release criteria. Stringent acceptance criteria for these aspects of product quality are reviewed and validated.

MicroStrategy also has an escalation process whereby customers can escalate critical issues through their Account Executive and Customer Support liaisons. In addition to the regularly scheduled Updates, MicroStrategy reviews these escalations frequently and may issue software patch releases in response. MicroStrategy patch releases generally roll up cumulatively into the next available MicroStrategy Update and into the next MicroStrategy Platform release.

Quality Programs

Features have stated Acceptance Criteria that typically account for functionality, performance, reliability, security, scalability, internationalization, compatibility, upgrade, and/or accessibility considerations. Before Features are approved into the execution plan, the definition, engineering design, continuous integration / unit tests, architecture, UX workflows, and/or quality test plan are inspected. This methodology enforces rigor before code is written, helps eliminate ambiguity, and unifies user experience design, engineering, architecture, and test plans around clear goals.

These programs help ensure that, throughout the iterations, work items are regularly inspected for adherence to the stated Acceptance Criteria. The overall test plan typically incorporates automated test methodologies for continuous integration, with any proposed code commit to the central, versioned source code repository automatically approved or rejected based on the results of these tests.
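A minimal sketch of what such an automatic approve/reject decision might look like; the `SuiteResult` shape and the all-suites-must-pass rule are assumptions made for illustration, not a description of MicroStrategy's actual pipeline.

```python
from dataclasses import dataclass
from enum import Enum

class Verdict(Enum):
    APPROVE = "approve"     # the proposed commit may enter the central repository
    REJECT = "reject"       # the proposed commit is returned to its author

@dataclass
class SuiteResult:
    suite: str
    passed: bool

def gate_commit(results: list[SuiteResult]) -> Verdict:
    """Approve a proposed commit only if every continuous integration suite passed.

    The single rule here (one failing suite rejects the commit) is an assumption
    made for illustration, not a statement of the actual gating policy.
    """
    if results and all(result.passed for result in results):
        return Verdict.APPROVE
    return Verdict.REJECT
```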

MicroStrategy Platform builds execute a large array of system tests across the enterprise release criteria, utilizing code quality scans and analysis to ensure integrity before the build is released to the scrum teams. The MicroStrategy builds are presently released for Windows, Linux, Amazon Linux, Red Hat Enterprise Linux, MicroStrategy on AWS, MicroStrategy on Azure, iOS, macOS, and Android. Once a build is released to QA, the scrum teams utilize end-to-end test cases driven from key MicroStrategy personas on the published Map of the Intelligent Enterprise as part of the validation process.

MicroStrategy conducts regular source code security scanning, binary code security scanning, internal penetration testing, and third-party independent penetration testing for security vulnerabilities. The types of vulnerabilities tested for include, but are not limited to, those identified by the OWASP Top 10 and the SANS Top 25.

MicroStrategy develops and maintains central deployments of its software platform. For each iteration, these systems are upgraded, the data is validated, and the associated performance benchmarks and capacity guidelines are measured and compared against the previous validation point. These systems are established through strategic relationships with our customers and partners. This is termed the “Customer Validation Program”.
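A minimal sketch of how benchmark results might be compared against the previous validation point, assuming a simple percentage-regression threshold; the metric names, figures, and the 10% tolerance are illustrative assumptions.

```python
REGRESSION_TOLERANCE = 0.10   # assumed threshold: flag anything more than 10% slower

def regressions(baseline: dict[str, float], latest: dict[str, float]) -> dict[str, float]:
    """Return metrics whose latest value exceeds the baseline by more than the tolerance."""
    flagged = {}
    for metric, base_value in baseline.items():
        new_value = latest.get(metric)
        if new_value is not None and new_value > base_value * (1 + REGRESSION_TOLERANCE):
            flagged[metric] = new_value / base_value - 1    # fractional slowdown
    return flagged

# Hypothetical timings (seconds) from the previous and current validation points.
previous = {"dashboard_load_s": 2.4, "cube_publish_s": 31.0}
current  = {"dashboard_load_s": 2.5, "cube_publish_s": 36.0}

print(regressions(previous, current))   # {'cube_publish_s': 0.161...}, i.e. ~16% slower
```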

MicroStrategy uses its own platform, deployed across internal systems, to execute mission-critical analytics for the organization. Key stakeholders on these internal systems routinely sign off on the build as part of the release cycles.

Quality Programs exist to ensure that customer issues escalated to the Technology team are assessed, assigned, scheduled, and resolved in a timely fashion. Senior Technology leadership regularly meets in coordination with Technical Support and Sales to ensure that the organization is aligned on the most up-to-date information. In these meetings, key metrics about the work items are typically reviewed, which can include age, priority, severity, impact, time to first response, and time to close. Items in these meetings are put through Root Cause Analysis (RCA), in which the underlying cause of the escalated issue is identified, the problem is rectified, and any engineering or quality process defects are addressed; resulting action items are added to the system of record for product development improvements.

Quality Programs partner with Enterprise Support Programs to ensure that personnel are regularly working with our customers. These engagements may take the form of Upgrade Programs, work with MicroStrategy’s Consulting department to gain experience in customer use cases, or UX Design sessions with customers that provide design input early in the development cycle.

MicroStrategy has a robust mechanism to capture feedback from its customers through its Field organization, with the intent of steering the roadmap and offering early previews of the Technology organization’s strategic direction. This extends to many strategic customer roadmap reviews. Many of the current items on the platform roadmap are a direct result of feedback from these sessions, as well as from Enterprise Support and UX engagements, customer intelligence from Support, and the account teams working in concert to advocate on behalf of the customer.

Product Development Inspection Points

The following are the key agile inspection points and outputs from the Product Development lifecycle at MicroStrategy; each inspection point is listed below with a description.

Release Planning

A transparent and inclusive synchronized planning ceremony that allows stakeholders and teams to understand the objectives, align the deliverables to the corporate vision, and commit to delivering a high-quality product to ensure customer success.

Release Readiness

Organization-wide inspection of acceptance criteria and product development Key Performance Indicators through the lenses of engineering architecture, user experience, customer work items, functionality, performance, scalability, security, internationalization, upgrade, compatibility, and accessibility. A CXO-level meeting ensures transparency of deliverables.

Triage Defects

Daily activity for the team to review incoming and open defects, understand their impact, analyze for patterns, prioritize and schedule. Customer defects are reviewed first to ensure thorough assessment and communication with the customer through Technical Support and the QA process.

Change Management

Ceremonies to maintain release focus, quality, and predictability of delivery by closely controlling scope that enters or exits a release once planning is completed. For each change, a full impact analysis is performed across key KPIs, including value, cost, and the quantified adjustments necessary to the delivery execution plan.

Iteration Planning

Rally point where the scrum team commits to the work it can deliver within the iteration. The Product Owner clarifies the details of the product backlog items and their respective acceptance criteria, so the delivery team understands the requirements and can define the work and effort necessary to meet the commitment.

Daily Stand Up

Daily time-boxed ceremony where the Scrum team meets to gauge the status and progress of their iteration. Team members briefly discuss the work done the day before and the work planned for that day, and highlight risks so the Product Owner and Scrum Master can take the necessary steps to mitigate them.

Scrum of Scrums

Daily time-boxed ceremony for Lead Scrum Masters, Product Owners, and Product Managers to review progress on the iteration and release, highlight and resolve dependencies, and plan their work for any features that require cross-team effort.

Backlog Refinement

Time-boxed ceremony where the entire Scrum team aligns on, understands, and decomposes work items that will be introduced in the upcoming iteration. The Product Owner presents a ranked backlog to the team and describes the work items, helping the team understand the requirements, value proposition, and acceptance criteria. The Product Owner receives input from the team on dependencies, risks, assumptions, acceptance criteria, and high-level estimates, which helps refine the work items prior to the Iteration Planning ceremony.

Iteration Review

Ceremony on the last day of the Iteration where the Scrum team demonstrates the completed product functionality to the rest of the team and stakeholders. It helps the team gather feedback directly from the stakeholders so adjustments can be made to upcoming iterations to align the plan with stakeholders’ requirements and priorities.

Iteration Retrospective

Last Scrum ceremony in an iteration where the Scrum team meets to review performance during the iteration. The team discusses the highlights of the iteration and agrees on a plan to make improvements for the following iteration.