<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[testomat.io changelog: new features, defects resolutions, improvements]]></title><description><![CDATA[The latest changes to testomat.io - next generation test management system for automated tests]]></description><link>https://changelog.testomat.io/</link><image><url>https://changelog.testomat.io/favicon.png</url><title>testomat.io changelog: new features, defects resolutions, improvements</title><link>https://changelog.testomat.io/</link></image><generator>Ghost 4.2</generator><lastBuildDate>Wed, 22 Apr 2026 17:41:57 GMT</lastBuildDate><atom:link href="https://changelog.testomat.io/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Suite-to-Folder Conversion, Bulk Tests Editing, and UI Enhancements]]></title><description><![CDATA[Enhancements to test management including converting suites to folders, bulk editing with test descriptions, and improved UI for Tests and Runs pages]]></description><link>https://changelog.testomat.io/suite-to-folder-conversion-bulk-editing-and-ui-enhancements-in-testomat/</link><guid isPermaLink="false">69bb0a25d1dea73ea043edb5</guid><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Tue, 24 Mar 2026 08:54:42 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2026/03/preview-42.5.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="convert-suite-to-folder">Convert Suite to Folder</h2>
<img src="https://changelog.testomat.io/content/images/2026/03/preview-42.5.png" alt="Suite-to-Folder Conversion, Bulk Tests Editing, and UI Enhancements"><p>We added the ability to <strong>convert</strong> a test suite into a folder. This helps when the initial suite structure needs to be <strong>reorganized</strong> as test coverage grows.</p>
<p>Previously, restructuring required creating new folders and suites manually, then moving tests between them, which made test organization slower and more complex, especially when the structure was not fully defined in advance.</p>
<p>With this update, converting a suite creates a folder with the same name and places the original suite inside it. You can then continue organizing tests within this folder by adding additional suites and building the required structure.</p>
<p>This simplifies test structure refactoring, reduces manual effort when reorganizing tests, and allows teams to evolve their test hierarchy more efficiently as the project grows.</p>
<p><img src="https://changelog.testomat.io/content/images/2026/03/convert-to-folder.gif" alt="Suite-to-Folder Conversion, Bulk Tests Editing, and UI Enhancements" loading="lazy"></p>
<h2 id="expanded-suite-bulk-edit-with-descriptions">Expanded Suite Bulk Edit with Descriptions</h2>
<p>We expanded the suite bulk edit functionality to support editing not only test titles, but also <strong>test descriptions</strong> directly within a single screen.</p>
<p>This allows working with multiple tests in a more structured way, including writing and updating descriptions alongside titles. It also enables creating tests within a suite as part of a larger, continuous flow&#x2014;similar to drafting a single user story and breaking it down into multiple test cases.</p>
<p>For the best experience, this workflow is aligned with the <a href="https://docs.testomat.io/project/import-export/export-tests/classical-tests-markdown-format/">Testomat markdown format for test definitions</a>.</p>
<p>This improvement makes <strong>bulk test creation and editing</strong> more efficient, improves consistency between titles and descriptions, and supports a more natural workflow when defining tests based on requirements or user stories.<br>
<img src="https://changelog.testomat.io/content/images/2026/03/suite-bulk-editor.gif" alt="Suite-to-Folder Conversion, Bulk Tests Editing, and UI Enhancements" loading="lazy"></p>
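<p>The bulk editor pairs naturally with a plain-text layout along these lines. This sketch is illustrative only: suite and test names are invented, and the linked documentation defines the authoritative syntax. In the sketch, titles are headings and the free text beneath each one is the test description, so a whole suite can be drafted as a single document:</p>

```md
# Login Suite

## Successful login

User signs in with valid credentials
and lands on the dashboard.

## Login with wrong password

An error message is shown and the user
stays on the login page.
```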
<h2 id="ui-improvements-for-tests-and-runs-pages">UI Improvements for Tests and Runs Pages</h2>
<p>We updated the <strong>user interface</strong> of the <strong>Tests</strong> and <strong>Runs</strong> pages with new styles to improve clarity, consistency, and usability across test management workflows.</p>
<p>The refreshed <strong>design</strong> enhances visual structure, making it easier to navigate test hierarchies, distinguish between suites and test cases, and review run results. Key information such as statuses, tags, environments, and execution details is now more clearly presented.</p>
<p>The updated UI improves visibility of <strong>test data</strong> and <strong>run results</strong>, simplifies navigation across complex test structures, and provides a more consistent and readable interface for managing tests and analyzing execution outcomes.</p>
<p><img src="https://changelog.testomat.io/content/images/2026/03/Runs_page_ui.png" alt="Suite-to-Folder Conversion, Bulk Tests Editing, and UI Enhancements" loading="lazy"></p>
<p><img src="https://changelog.testomat.io/content/images/2026/03/Tests_page_ui.png" alt="Suite-to-Folder Conversion, Bulk Tests Editing, and UI Enhancements" loading="lazy"></p>
<h2 id="%F0%9F%9B%A0%EF%B8%8F-fixes-and-improvements">&#x1F6E0;&#xFE0F; Fixes and Improvements</h2>
<p><strong>Improvements</strong></p>
<ul>
<li>Improved Zephyr import, preserving <strong>Test Type</strong> data as labels</li>
<li>Added filtering by labels in Automation Coverage <a href="https://github.com/testomatio/app/issues/1344">https://github.com/testomatio/app/issues/1344</a></li>
<li>Added filtering by detached tests in Automation Coverage <a href="https://github.com/testomatio/app/issues/1458">https://github.com/testomatio/app/issues/1458</a></li>
<li>Added ability to duplicate metadata when duplicating tests <a href="https://github.com/testomatio/app/issues/1309">https://github.com/testomatio/app/issues/1309</a></li>
<li>Improved counter processing for Run Groups <a href="https://github.com/testomatio/app/issues/1359">https://github.com/testomatio/app/issues/1359</a></li>
<li>Improved processing of Owner and CreatedBy fields when importing XLSX/CSV in Testomat format <a href="https://github.com/testomatio/app/issues/1516">https://github.com/testomatio/app/issues/1516</a></li>
<li>Improved formatting when saving AI-generated suite summaries <a href="https://github.com/testomatio/app/issues/1518">https://github.com/testomatio/app/issues/1518</a></li>
<li>Added support for commas in parameters (examples) <a href="https://github.com/testomatio/app/issues/1539">https://github.com/testomatio/app/issues/1539</a></li>
</ul>
<p><strong>Fixes</strong></p>
<ul>
<li>Fixed transition from read-only user to Accountant</li>
<li>Fixed Run Duration view <a href="https://github.com/testomatio/app/issues/1423">https://github.com/testomatio/app/issues/1423</a></li>
<li>Fixed updating test descriptions after assigning status <a href="https://github.com/testomatio/app/issues/1450">https://github.com/testomatio/app/issues/1450</a></li>
<li>Fixed rendering of tags starting with uppercase letters <a href="https://github.com/testomatio/app/issues/1462">https://github.com/testomatio/app/issues/1462</a></li>
<li>Fixed &quot;Explain Failure&quot; error: &quot;messages[1].content must be a string&quot; <a href="https://github.com/testomatio/app/issues/1481">https://github.com/testomatio/app/issues/1481</a></li>
<li>Fixed partial deletion of tests after adding images or pasting text/images <a href="https://github.com/testomatio/app/issues/1482">https://github.com/testomatio/app/issues/1482</a></li>
<li>Fixed merging of test parameters when merging a branch <a href="https://github.com/testomatio/app/issues/1485">https://github.com/testomatio/app/issues/1485</a></li>
<li>Fixed processing of Cyrillic characters when exporting to Markdown <a href="https://github.com/testomatio/app/issues/1489">https://github.com/testomatio/app/issues/1489</a></li>
<li>Fixed &quot;undefined method <code>suite</code> for nil&quot; error in Chat with Tests <a href="https://github.com/testomatio/app/issues/1499">https://github.com/testomatio/app/issues/1499</a></li>
<li>Fixed AI provider request failure when using Explain Failure follow-up <a href="https://github.com/testomatio/app/issues/1504">https://github.com/testomatio/app/issues/1504</a></li>
<li>Fixed processing of automated tests in mixed runs <a href="https://github.com/testomatio/app/issues/1524">https://github.com/testomatio/app/issues/1524</a></li>
<li>Fixed filter duplication in the opened sidebar <a href="https://github.com/testomatio/app/issues/1535">https://github.com/testomatio/app/issues/1535</a></li>
<li>Fixed pagination in the &quot;Jira Issues&quot; tab on the Tests page <a href="https://github.com/testomatio/app/issues/1536">https://github.com/testomatio/app/issues/1536</a></li>
<li>Fixed Pin mode causing UI issues and missing &quot;Save&quot; button on the Templates page <a href="https://github.com/testomatio/app/issues/1546">https://github.com/testomatio/app/issues/1546</a></li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and explore all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a> and what other features we should build next!</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Improved Test Plans, Shared Tests, Reporting, and UI Enhancements]]></title><description><![CDATA[Discover Testomat’s latest updates: redesigned New Plan, shared tests across projects, improved UI, faster test results, and enhanced copy features]]></description><link>https://changelog.testomat.io/improved-test-plans-shared-tests-reporting-and-ui-enhancements/</link><guid isPermaLink="false">6968c349acbcb904c810bfd3</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Thu, 05 Feb 2026 05:20:16 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2026/02/preview423.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="redesigned-new-plan-screen-with-combined-test-selection">Redesigned New Plan Screen with Combined Test Selection</h2>
<img src="https://changelog.testomat.io/content/images/2026/02/preview423.png" alt="Improved Test Plans, Shared Tests, Reporting, and UI Enhancements"><p>We redesigned the <strong>New Plan screen</strong> to improve the usability and flexibility of <strong>Test Plan creation</strong>. Users can now combine filtering and manual selection of <strong>test cases and test suites</strong> within a single workflow.</p>
<p>This allows adding tests from different sources using multiple criteria. For example, you can include <strong>high-priority tests</strong> from one suite and <strong>labeled tests</strong> from another suite in the same plan. Multiple queries (<strong>test collections</strong>) can be combined into a single <strong>Test Plan</strong>.</p>
<p>The updated flow makes <strong>Test Plan creation</strong> more flexible and precise, reduces the need for separate plans or manual adjustments, and helps teams assemble <strong>test coverage</strong> that better matches real <strong>test execution</strong> and release needs.</p>
<p><img src="https://changelog.testomat.io/content/images/2026/02/new-plan.gif" alt="Improved Test Plans, Shared Tests, Reporting, and UI Enhancements" loading="lazy"></p>
<h2 id="shared-tests-across-projects">Shared Tests Across Projects</h2>
<p>We introduced <strong>shared tests</strong>, allowing <strong>test folders and test suites</strong> from a source project to be shared with one or multiple <strong>projects</strong>. This makes it possible to reuse common <strong>test structures</strong> and scenarios across teams and products without duplicating data.</p>
<p>Shared suites remain synchronized with the <strong>source project</strong>. Any updates to the original test folders, suites, or <strong>test cases</strong> are automatically propagated to all connected projects, ensuring consistency of shared <strong>test coverage</strong>.</p>
<p>At the same time, test cases in <strong>target projects</strong> can be adjusted independently. Changes made to test cases in a consuming project do not affect the source project or other projects using the same shared suites.</p>
<p><strong>Benefits</strong></p>
<ul>
<li>Reuse <strong>test folders and suites</strong> across one or multiple projects</li>
<li>Keep shared <strong>test structures</strong> and content synchronized automatically</li>
<li>Reduce <strong>test duplication</strong> and maintenance effort</li>
<li>Allow <strong>project-specific customization</strong> without impacting the source project</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2026/02/share-tests.gif" alt="Improved Test Plans, Shared Tests, Reporting, and UI Enhancements" loading="lazy"></p>
<h2 id="updated-new-run-ui-with-launch-type-tabs">Updated New Run UI with Launch Type Tabs</h2>
<p>We updated the <strong>New Run screen</strong> and introduced tabs for different <strong>test launch types</strong>. This allows users to switch between run modes and focus only on the options relevant to the selected launch type.</p>
<p>By separating <strong>run configurations</strong> into dedicated tabs, the <strong>New Run flow</strong> becomes clearer and less cluttered, making it easier to set up the required type of <strong>test execution</strong> without distraction.</p>
<p><img src="https://changelog.testomat.io/content/images/2026/01/launch-run-screen.gif" alt="Improved Test Plans, Shared Tests, Reporting, and UI Enhancements" loading="lazy"></p>
<h2 id="passing-suite-ids-to-ci-from-new-run-view">Passing Suite IDs to CI from New Run View</h2>
<p>We added the ability to select and pass <strong>suite IDs</strong> to <strong>CI</strong> when creating a new run. This is available in the <strong>New Run</strong> view as a multi-select input, allowing you to define which <strong>test suites</strong> should be executed as part of a <strong>mixed run</strong>.</p>
<p>The multi-select is implemented as a dropdown, where you can choose the required suites by their IDs. The selected suite IDs are then passed to your <strong>CI system</strong>.</p>
<p>To use this feature, you need to adjust your <strong>CI workflow</strong> by adding a grep filter for the selected suites, and update your <strong>Testomat CI configuration</strong> to accept and process the passed suite IDs. See an example <a href="https://github.com/testomatio/app/issues/1229#issuecomment-3728395965">here</a>.</p>
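<p>On the CI side, the received suite IDs can be turned into a grep filter for your test runner. A minimal sketch, in which the <code>SUITE_IDS</code> variable name and the Playwright command are assumptions to adapt to your pipeline:</p>

```shell
# Suite IDs selected in the New Run view arrive as a comma-separated
# list; the SUITE_IDS variable name is an assumption for illustration.
SUITE_IDS="S12345678,S87654321"

# Convert the list into an alternation pattern: S12345678|S87654321
GREP_PATTERN=$(printf '%s' "$SUITE_IDS" | tr ',' '|')

# Hand the pattern to your runner's grep option (Playwright shown as an
# example; substitute your framework's equivalent flag).
echo "npx playwright test --grep \"$GREP_PATTERN\""
```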
<h2 id="improved-test-result-view-ui">Improved Test Result View UI</h2>
<p>We improved the user interface of the <strong>individual test result view</strong>, focusing on layout, readability, and navigation.</p>
<p>The improved layout and visual structure make individual <strong>test results</strong> easier to read and navigate, while UI optimizations ensure faster rendering and a smoother <strong>test review</strong> experience, especially for detailed or large <strong>test outputs</strong>.</p>
<p><img src="https://changelog.testomat.io/content/images/2026/01/new-result-view.gif" alt="Improved Test Plans, Shared Tests, Reporting, and UI Enhancements" loading="lazy"></p>
<h2 id="expanded-copying-for-tests-and-suites">Expanded Copying for Tests and Suites</h2>
<p>We expanded the copying capabilities for <strong>tests and test suites</strong>. In addition to the test description, you can now copy <strong>labels</strong>, <strong>linked issues</strong> (including <strong>Jira</strong>), and <strong>test attachments</strong>.</p>
<p>The copy flow has been enhanced to make these options explicit and configurable in the UI. When initiating a copy action, the system now clearly shows which <strong>test metadata</strong> will be included, allowing users to review and confirm the scope of the copy before proceeding.</p>
<p><strong>Updated UI flow</strong></p>
<ul>
<li>
<p>Open a test or test suite</p>
</li>
<li>
<p>Select the <strong>Copy</strong> action from the available options</p>
</li>
<li>
<p>In the copy dialog, review the included data:</p>
<ul>
<li>labels</li>
<li>linked issues (including <strong>Jira issues</strong>)</li>
<li>attachments</li>
</ul>
</li>
<li>
<p>Confirm the copy action to create a new test or suite with the selected metadata preserved</p>
</li>
</ul>
<p>The detailed copy flow ensures tests and suites are duplicated with full context, reducing manual follow-up work and preventing data loss. This helps teams reuse tests faster while keeping <strong>labels</strong>, <strong>issue links</strong>, and <strong>attachments</strong> consistent across copied items.</p>
<p><img src="https://changelog.testomat.io/content/images/2026/01/copy-test-meta.gif" alt="Improved Test Plans, Shared Tests, Reporting, and UI Enhancements" loading="lazy"></p>
<h2 id="pin-mode-and-display-settings-for-tests-page">Pin Mode and Display Settings for Tests Page</h2>
<p>We introduced <strong>Pin mode</strong> for the <strong>tests and suites view</strong>, allowing users to keep a fixed side panel when viewing a test or suite. This layout is similar to other <strong>test management systems</strong> and helps make <strong>Testomat adoption</strong> easier.</p>
<p>To support this, we added <strong>Display settings</strong> on the <strong>Tests page</strong>, where you can control how the <strong>test structure</strong> and side view are displayed.</p>
<p><strong>Display settings options</strong></p>
<ul>
<li><strong>Hide tests in tree</strong> &#x2014; shows only folders and suites in the main tree, helping keep the <strong>project structure</strong> clean and easier to navigate</li>
<li><strong>Pin sidebar</strong> &#x2014; keeps the suite or test view fixed in a side panel while browsing the test tree</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2026/01/display-set.png" alt="Improved Test Plans, Shared Tests, Reporting, and UI Enhancements" loading="lazy"></p>
<p>The new <strong>Pin mode</strong> and display options provide a more familiar and flexible layout, improve navigation in large <strong>test structures</strong>, and make it easier to review and manage tests without losing context.</p>
<p><img src="https://changelog.testomat.io/content/images/2026/01/clear-view.gif" alt="Improved Test Plans, Shared Tests, Reporting, and UI Enhancements" loading="lazy"></p>
<h2 id="optimized-test-result-processing">Optimized Test Result Processing</h2>
<p>We optimized the reporting and processing of <strong>test results</strong> submitted via <strong>XML</strong>. Results are now delivered to <strong>Testomat</strong> faster, especially for large <strong>test runs</strong>.</p>
<p>This improvement resolves an issue where some test results could be missing in runs with thousands of tests and eliminates delays when updating tests and <strong>execution results</strong>.</p>
<p><strong>Benefits</strong></p>
<ul>
<li>Faster availability of test results in <strong>large-scale test runs</strong></li>
<li>More reliable and complete <strong>run data</strong></li>
<li>Reduced delays when updating tests and <strong>execution statuses</strong></li>
</ul>
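<p>For reference, submitting JUnit-style XML with the reporter can be sketched as below. The <code>report-xml</code> entry point and flags reflect our reading of the <code>@testomatio/reporter</code> documentation; verify them against the reporter README for your framework:</p>

```shell
# Upload JUnit-style XML results to Testomat.io after a test run.
# Command name and flags follow the @testomatio/reporter docs as we
# understand them -- double-check the reporter README for your setup.
export TESTOMATIO=tstmt_xxxx                 # project API key
npx report-xml "test-results/**/*.xml" --lang=java
```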
<h2 id="%F0%9F%9B%A0%EF%B8%8F-fixes-and-improvements">&#x1F6E0;&#xFE0F; Fixes and Improvements</h2>
<ul>
<li>Fixed launching multi-environment runs <a href="https://github.com/testomatio/app/issues/1445">https://github.com/testomatio/app/issues/1445</a></li>
<li>Fixed creation of multiple runs when using TESTOMATIO_SHARED_RUN_TIMEOUT <a href="https://github.com/testomatio/app/issues/1410">https://github.com/testomatio/app/issues/1410</a></li>
<li>Improved trace processing for BDD projects <a href="https://github.com/testomatio/app/issues/1370">https://github.com/testomatio/app/issues/1370</a></li>
<li>Improved editing of plans for ongoing runs so excluded tests are removed <a href="https://github.com/testomatio/app/issues/1330">https://github.com/testomatio/app/issues/1330</a></li>
<li>Improved processing of tags for automated tests <a href="https://github.com/testomatio/app/issues/1236">https://github.com/testomatio/app/issues/1236</a></li>
<li>Stopped showing &quot;0 failures&quot; in Slack notifications <a href="https://github.com/testomatio/app/issues/900">https://github.com/testomatio/app/issues/900</a></li>
<li>Improved importing of suites nested inside another suite for JS frameworks and Markdown <a href="https://github.com/testomatio/check-tests/issues/198">https://github.com/testomatio/check-tests/issues/198</a>, <a href="https://github.com/testomatio/app/issues/1408">https://github.com/testomatio/app/issues/1408</a></li>
<li>Fixed export to Markdown for Cyrillic characters <a href="https://github.com/testomatio/app/issues/1489">https://github.com/testomatio/app/issues/1489</a></li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and explore all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a> and what other features we should build next!</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[New Requirements, Improved Test Launching, Robot Framework Support & MCP Server]]></title><description><![CDATA[Requirements management, new requirement sources, MCP Server setup from the UI, Robot Framework support, and improved Global Analytics widgets.]]></description><link>https://changelog.testomat.io/new-requirements-improved-test-launching-robot-framework-support-mcp-server/</link><guid isPermaLink="false">692816a6648a8d04c6a9ebfe</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Fri, 05 Dec 2025 06:46:55 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/12/preview42-2.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h1 id="ai-requirements-update">AI Requirements Update</h1>
<h2 id="new-requirements-management-page">New Requirements Management Page</h2>
<img src="https://changelog.testomat.io/content/images/2025/12/preview42-2.png" alt="New Requirements, Improved Test Launching, Robot Framework Support &amp; MCP Server"><p>We introduced a dedicated <strong>Requirements</strong> page that brings all your requirements into a single, convenient workspace. Now you can view, organize, and update requirements in one place &#x2014; without jumping across different parts of the project.</p>
<p>This improvement makes it easier to:</p>
<ul>
<li>browse and manage large requirement sets</li>
<li>navigate between requirements and linked tests</li>
<li>keep your requirement coverage clean and structured</li>
</ul>
<p>A more scalable workflow for teams working with complex specifications.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/12/Requirements-page.png" alt="New Requirements, Improved Test Launching, Robot Framework Support &amp; MCP Server" loading="lazy"></p>
<h2 id="new-requirement-sources-files-text">New Requirement Sources: Files &amp; Text</h2>
<p>You now have more flexible ways to create requirements:</p>
<p><strong>Supported sources include:</strong></p>
<ul>
<li><strong>Files</strong> &#x2014; PDF, DOCX, XLSX, CSV, TXT, Markdown, and others</li>
<li><strong>Plain text</strong> &#x2014; paste any text and instantly convert it into structured requirements</li>
</ul>
<p>This makes it much easier to import requirements from PRDs, customer documentation, spreadsheets, briefs, or any internal artifacts your team already uses.</p>
<h2 id="global-requirements-setup">Global Requirements Setup</h2>
<p>You can now <strong>set Requirements as Global</strong>, allowing the system to automatically attach selected requirements to <strong>every newly created test</strong>.<br>
This is especially useful for teams working with mandatory compliance rules, overarching documentation, or project-wide standards that must be reflected in all tests.</p>
<p>To use it:<br>
<strong>Open Project &#x2192; Requirements &#x2192; Select a Requirement &#x2192; Global Requirements</strong></p>
<p>This ensures consistency and saves time by eliminating repeated manual assignment.</p>
<h2 id="ai-powered-image-understanding-for-requirements">AI-Powered Image Understanding for Requirements</h2>
<p>You can now attach <strong>images directly to your requirements</strong>, and AI will automatically analyze them and include their content in the requirement summary. This enriched summary is then used during <strong>test generation</strong>, ensuring that visual details are not missed.</p>
<p>All attached images are also <strong>automatically added to the tests</strong> created from these requirements.</p>
<p>The same behavior now applies to <strong>Jira- and Confluence-based requirements</strong>:<br>
if your Jira ticket or Confluence page includes attachments (images, diagrams, files), Testomat.io will <strong>import them, analyze them, and enrich the requirements summary</strong> &#x2014; giving you more accurate AI-generated tests.</p>
<h1 id="%F0%9F%9B%A0%EF%B8%8F-easy-mcp-server-setup-from-project-ui">&#x1F6E0;&#xFE0F; Easy MCP Server Setup from Project UI</h1>
<p>You can now quickly connect the <strong>Testomat.io MCP Server</strong> directly from your project interface.</p>
<p>The <strong>Model Context Protocol (MCP)</strong> allows AI assistants like <strong>Claude Desktop, Cursor, and Zed</strong> to securely access and manage your test cases.<br>
With this release, we&#x2019;ve made configuration effortless &#x2014; no need to search documentation.</p>
<p><strong>How to access:</strong><br>
<strong>Open your project &#x2192; Extra menu (top-right corner) &#x2192; MCP Server</strong></p>
<p>Inside, you will find step-by-step instructions for connecting the Testomat.io MCP Server to your preferred AI tool.</p>
<p>This makes integrating AI into your workflow faster, smoother, and more accessible for your entire team.</p>
<p>You can find more details on the Testomat.io MCP Server <a href="https://github.com/testomatio/mcp">here</a>.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/12/MCP.png" alt="New Requirements, Improved Test Launching, Robot Framework Support &amp; MCP Server" loading="lazy"></p>
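<p>For MCP clients configured via JSON (such as Claude Desktop), the generated entry generally follows the shape below. Every value here is a placeholder: the MCP Server screen in your project shows the exact command, arguments, and token to use:</p>

```json
{
  "mcpServers": {
    "testomatio": {
      "command": "npx",
      "args": ["-y", "@testomatio/mcp"],
      "env": { "TESTOMATIO": "tstmt_xxxx" }
    }
  }
}
```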
<h1 id="%F0%9F%A4%96-robot-framework-support">&#x1F916; Robot Framework Support</h1>
<p><strong>Robot Framework</strong> is a popular Python-based automation framework used for acceptance testing, ATDD, and RPA. It uses keyword-driven testing, making tests readable and easy to maintain.</p>
<p>With this update, you can now <strong>import Robot Framework tests into Testomat.io</strong> and <strong>report test results</strong> directly from your CI/CD pipelines or local runs.</p>
<p><strong>Benefits of using Testomat.io for Robot Framework tests:</strong></p>
<ul>
<li>Centralized test reporting and management for all Robot tests</li>
<li>Traceability of test results and easy linking to requirements or issues</li>
<li>Insights into test coverage, flaky tests, and execution trends</li>
<li>Ability to combine Robot tests with manual and other automated tests in mixed runs for unified reporting</li>
</ul>
<p>You can find more details on the Testomat.io plugin for Robot Framework <a href="https://github.com/testomatio/robot-framework-reporter">here</a>.</p>
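<p>A local Robot Framework run with reporting enabled can be sketched roughly as follows. The package and listener names are assumptions for illustration; the plugin README linked above gives the exact ones:</p>

```shell
# Install the Testomat.io reporter plugin for Robot Framework
# (package and listener names below are illustrative -- see the README).
pip install robotframework-reporter-testomatio

# Run the suite with the Testomat.io listener and your project API key
TESTOMATIO=tstmt_xxxx robot --listener testomatio.Listener tests/
```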
<h1 id="%F0%9F%8C%90-global-analytics-improvements">&#x1F310; Global Analytics Improvements</h1>
<p>We refined and improved the <strong>Global Analytics</strong> experience to make it more intuitive and visually appealing.</p>
<p><strong>What&#x2019;s new:</strong></p>
<ul>
<li><strong>Simplified widget creation</strong> &#x2013; it&#x2019;s now smoother and faster to configure analytics widgets for your workspace</li>
<li><strong>Refreshed UI</strong> &#x2013; widgets have an updated, cleaner look for better readability and a more modern analytics dashboard</li>
</ul>
<p>These enhancements make it easier to build meaningful insights and keep your global Testomat.io analytics organized and clear.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/12/create-widget.png" alt="New Requirements, Improved Test Launching, Robot Framework Support &amp; MCP Server" loading="lazy"></p>
<h1 id="%F0%9F%94%97-linking-automated-tests-to-manual-test-cases">&#x1F517; Linking Automated Tests to Manual Test Cases</h1>
<p>We&#x2019;ve expanded how <strong>automated test results</strong> can be connected with <strong>manual test cases</strong>, giving you full control over how tests appear in reports. This improvement ensures that all scenarios &#x2014; automated or manual &#x2014; are properly tracked and reported.</p>
<h2 id="key-updates">Key Updates</h2>
<h3 id="automated-runs-default"><strong>Automated Runs (Default)</strong></h3>
<p>Automated runs remain focused on automated test execution results. Manual test cases linked via <code>linkTest()</code> are shown as references but do not count as executed tests. This keeps automated test reports clean while still showing related manual coverage.</p>
<h3 id="manual-runs"><strong>Manual Runs</strong></h3>
<p>In manual runs, you can link automated tests to manual test cases. Only manual test cases appear in the report, while automated tests remain hidden. This is useful when automated checks verify manual test status without inflating execution counts.</p>
<h3 id="mixed-runs"><strong>Mixed Runs</strong></h3>
<p>Mixed runs combine both <strong>automated execution</strong> and <strong>manual test results</strong> in a single report. Each appears as a separate entry, giving you a complete and accurate view of test coverage.</p>
<h3 id="how-it-works">How It Works</h3>
<pre><code class="language-bash"># 1. Create a run (manual or mixed)
RUN_ID=$(TESTOMATIO=tstmt_xxxx npx @testomatio/reporter start --kind mixed | tail -n 1)

# 2. Run your automated tests while linking to the created run
TESTOMATIO=tstmt_xxxx TESTOMATIO_RUN=$RUN_ID &lt;run tests command&gt;

# 3. Or use the one-step approach
TESTOMATIO=tstmt_xxxx npx @testomatio/reporter run &quot;&lt;run tests command&gt;&quot; --kind mixed
</code></pre>
<p>This flexible linking system ensures that automated and manual tests are fully integrated, giving teams better insight into testing progress and coverage.</p>
<p>Find more details <a href="https://github.com/testomatio/reporter/blob/2.x/docs/linking-tests.md">here</a>.</p>
<h1 id="%F0%9F%8E%AF-assign-priority-link-issues-while-creating-or-editing-tests">&#x1F3AF; Assign Priority &amp; Link Issues While Creating or Editing Tests</h1>
<p>We streamlined the test creation flow!<br>
Now, when <strong>creating or editing a test</strong>, you can immediately:</p>
<ul>
<li><strong>Set test priority</strong></li>
<li><strong>Link the test to an issue</strong> (Jira, GitHub, etc.)</li>
</ul>
<p>This removes the need for additional edits and helps maintain consistent test documentation from the start.</p>
<p>&#x1F4CD; <em>This option is available now directly in the test creation/edit modal.</em></p>
<p><img src="https://changelog.testomat.io/content/images/2025/11/prio-link-test.png" alt="New Requirements, Improved Test Launching, Robot Framework Support &amp; MCP Server" loading="lazy"></p>
<h1 id="%F0%9F%9A%80-launch-automated-tests-directly-from-the-test-tree">&#x1F680; Launch Automated Tests Directly from the Test Tree</h1>
<p>We expanded the ability to trigger automated tests right from the <strong>Tests page</strong>. You can now <strong>multiselect tests or suites</strong> in the Test Tree and launch:</p>
<ul>
<li>Manual runs</li>
<li>Automated runs</li>
<li>Mixed runs</li>
</ul>
<p>This provides a faster workflow &#x2014; especially when working with hybrid projects combining manual and automated testing.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/11/launch-tests.gif" alt="New Requirements, Improved Test Launching, Robot Framework Support &amp; MCP Server" loading="lazy"></p>
<h1 id="%E2%9E%95-create-an-empty-test-run-build-it-on-the-go">&#x2795; Create an Empty Test Run &amp; Build It on the Go</h1>
<p>We introduced a new way to start your testing process &#x2014; <strong>create an empty Test Run first</strong>, then <strong>add or create tests as you go</strong>.</p>
<p>Ideal for:</p>
<ul>
<li>Exploratory testing</li>
<li>Quick verification tasks</li>
<li>Creating tests from findings during execution</li>
</ul>
<p>Your Test Run can now evolve <strong>in real time</strong>, adapting to your testing flow without requiring upfront setup.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/11/no-tests-launch.png" alt="New Requirements, Improved Test Launching, Robot Framework Support &amp; MCP Server" loading="lazy"></p>
<h1 id="%F0%9F%A7%AD-issues-traceability-coverage-report">&#x1F9ED; Issues Traceability Coverage Report</h1>
<p>We added a new <strong>Traceability report</strong> to help you see how well your issues are covered by tests. Now you can easily track test coverage for Jira or other linked issues across your project.</p>
<p><strong>Where to find it:</strong><br>
<strong>Analytics &#x2192; Issues widget &#x2192; Extra menu &#x2192; Download Traceability report</strong></p>
<p><strong>Pro tip:</strong><br>
Use filters to narrow down results: <strong>Environments, Tags, Labels, Date Range, Priority</strong></p>
<p>This report is especially useful for release readiness checks and coverage audits.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/11/issues-report.png" alt="New Requirements, Improved Test Launching, Robot Framework Support &amp; MCP Server" loading="lazy"></p>
<h2 id="%F0%9F%9B%A0%EF%B8%8F-fixes-and-improvements">&#x1F6E0;&#xFE0F; Fixes and Improvements</h2>
<ul>
<li>Fixed timestamps in the Test Run tab &#x2014; now showing the exact execution time</li>
<li>Fixed test removal after launching tests from the Test Tree</li>
<li>Fixed UI updates when adding/removing tags in bulk</li>
<li>Fixed reporting of automated test steps via the API</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and check all amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a>, and what other features we should do!</div><!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[AI Test Review, Test Run Notes, and Company-Wide Statistics]]></title><description><![CDATA[Add Test Run Notes for quick temporary checks, explore new Company Statistics widgets, and import Jira links directly from spreadsheets.]]></description><link>https://changelog.testomat.io/ai-test-review-test-run-notes-and-company-wide-statistics/</link><guid isPermaLink="false">68f9e3c2d63c9b0567031748</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Mon, 10 Nov 2025 07:21:57 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/11/preview-notes.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h1 id="test-case-code-quality-review">Test Case &amp; Code Quality Review</h1>
<img src="https://changelog.testomat.io/content/images/2025/11/preview-notes.png" alt="AI Test Review, Test Run Notes, and Company-Wide Statistics"><p>We introduced <strong>AI-powered Quality Review</strong> for both <strong>manual</strong> test descriptions and <strong>automated</strong> test code.<br>
This feature analyzes your tests and provides <strong>intelligent feedback</strong> with actionable advice to improve clarity, structure, and best practices.</p>
<p>You can use it in two ways:</p>
<ul>
<li><strong>Test Case Quality Review</strong> &#x2013; Evaluates the quality of a manual test&#x2019;s description, suggesting improvements for readability, consistency, and completeness.</li>
<li><strong>Test Code Quality Review</strong> &#x2013; Reviews your automated test code to detect potential issues, enhance maintainability, and align with testing standards.</li>
</ul>
<p><strong>How to use:</strong><br>
Open any test &#x2192; choose <strong>Description</strong> or <strong>Code</strong> &#x2192; click the <strong>AI button</strong> &#x2192; select <strong>Code Quality Review</strong> or <strong>Test Quality Review</strong>.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/11/qualiti-review.gif" alt="AI Test Review, Test Run Notes, and Company-Wide Statistics" loading="lazy"></p>
<h1 id="company-statistics-widgets">Company Statistics Widgets</h1>
<p>We&#x2019;ve introduced <strong>Company Statistics</strong>, a new feature that provides an overview of activity across all projects within your organization.<br>
This update includes three insightful widgets:</p>
<ul>
<li><strong>Data Statistics</strong> &#x2013; Tracks model activity such as Tests, Plans, Suites, Runs, Imports, and Test Runs. Displays created, updated, and deleted counts per project for complete visibility of repository changes.</li>
<li><strong>User Activity</strong> &#x2013; Shows detailed insights into user behavior, including logins, actions performed on tests and other models, and manual test run activity across the organization.</li>
<li><strong>AI Usage</strong> &#x2013; Monitors AI prompt utilization, displaying success rates, the number of active users, and usage distribution by prompt type across all projects.</li>
</ul>
<p>&#x1F4CA; You can also export company usage data to a spreadsheet for reporting, sharing, or deeper analysis.</p>
<p>This option is available for users with <strong>Owner</strong> and <strong>Manager</strong> roles.<br>
To access it, go to <strong>Companies &#x2192; open your Company &#x2192; Extra menu button &#x2192; Statistics</strong>.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/10/CleanShot-2025-10-23-at-10.14.14@2x.png" alt="AI Test Review, Test Run Notes, and Company-Wide Statistics" loading="lazy"></p>
<h1 id="notes-temporary-test-cases">Notes (Temporary Test Cases)</h1>
<p>Introducing <strong>Notes</strong> &#x2014; lightweight, temporary test cases designed for quick or one-time checks during your testing process. They help teams record ad-hoc test ideas or short-term validations without adding unnecessary clutter to the main test repository.</p>
<p><strong>Key Highlights:</strong></p>
<ul>
<li><strong>Quick creation</strong> &#x2013; Instantly create a Note within an active test run or test plan.</li>
<li><strong>Isolated from repository</strong> &#x2013; Notes stay separate from the permanent test hierarchy, keeping your project clean and focused.</li>
<li><strong>Convertible</strong> &#x2013; When a Note turns out to be valuable, you can easily convert it into a permanent test case.</li>
</ul>
<p>This feature is ideal for exploratory sessions, short-term fixes, or documenting spontaneous test scenarios during a run.</p>
<p>Learn how it works in the <a href="https://docs.testomat.io/project/runs/temporary-tests-notes">user guide</a>.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/11/Notes.gif" alt="AI Test Review, Test Run Notes, and Company-Wide Statistics" loading="lazy"></p>
<h1 id="drag-and-drop-in-side-view">Drag-and-Drop in Side View</h1>
<p>We&#x2019;ve enhanced navigation and organization within the <strong>Side View</strong> by adding <strong>drag-and-drop functionality</strong>.<br>
Now you can easily reorder items without switching context:</p>
<ul>
<li><strong>Reorder tests within a suite</strong> &#x2013; simply drag tests to adjust their order.</li>
<li><strong>Reorder suites within a folder</strong> &#x2013; drag entire suites to reorganize structure within the same folder.</li>
</ul>
<p>This improvement is especially valuable for projects with <strong>deeply nested hierarchies</strong>, helping teams manage large repositories more efficiently while keeping changes local to the current context.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/11/d-n-d-side.gif" alt="AI Test Review, Test Run Notes, and Company-Wide Statistics" loading="lazy"></p>
<h1 id="import-jira-links-from-spreadsheet">Import Jira Links from Spreadsheet</h1>
<p>We&#x2019;ve expanded the <strong>spreadsheet import</strong> functionality to support <strong>Jira (and other issue tracker) links</strong>.</p>
<p>Now, when you import tests using the <a href="https://docs.testomat.io/project/import-export/import/import-tests-from-csv-xlsx/#how-to-%D1%81reate-custom-xls-for-testomatio">Testomat.io spreadsheet format</a>, the corresponding <strong>Jira issues will be automatically linked</strong> to your test cases during import.</p>
<p><strong>How it works:</strong></p>
<ul>
<li>&#x1F4C4; Use the Testomat.io spreadsheet template, now enriched with an <strong>&#x201C;Issues&#x201D;</strong> column.</li>
<li>&#x1F517; When you import, Testomat.io will automatically connect tests to their related Jira (or other issue tracker) items.</li>
<li>&#x2699;&#xFE0F; Make sure your <strong>Issue Management integration</strong> is configured before importing &#x2014; <a href="https://docs.testomat.io/integrations/issues-management/">see setup guide</a> for details.</li>
</ul>
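<p>For illustration, a minimal spreadsheet in this format might look like the sketch below (saved as CSV). The exact column set should follow the official template linked above; the column names and Jira keys here are placeholders, with the new <strong>&quot;Issues&quot;</strong> column being the relevant part:</p>
<pre><code class="language-csv">Title,Priority,Description,Issues
Login with valid credentials,high,User can sign in with a correct password,PROJ-101
Reset forgotten password,normal,Reset link is emailed to the user,PROJ-102
</code></pre>
<p>On import, each test would then be linked to the Jira issue whose key appears in its &quot;Issues&quot; cell, provided the Issue Management integration is configured.</p>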
<p>This improvement streamlines migration and bulk data import by keeping all test-to-issue relationships intact right from the start.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/11/import-jira-links.png" alt="AI Test Review, Test Run Notes, and Company-Wide Statistics" loading="lazy"></p>
<h1 id="plan-updates-now-reflected-in-pulse">Plan Updates Now Reflected in Pulse</h1>
<p>We&#x2019;ve enhanced <strong>Pulse</strong> to provide even deeper visibility into your project activity. Now, any <strong>changes made to Test Plans</strong> &#x2014; such as edits, updates, or new plan creations &#x2014; are <strong>instantly reflected in Pulse</strong>.</p>
<p>This improvement ensures that your team can easily track not only test runs and results but also strategic updates in your testing workflow, all in one place.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/11/plan--pulse.png" alt="AI Test Review, Test Run Notes, and Company-Wide Statistics" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Optimized search, providing faster search results</li>
<li>Added more options for quick test creation in the test tree</li>
<li>Fixed display of Jira statuses when hovering over a Jira issue</li>
<li>Fixed Test Run History duplication and filtering (<a href="https://github.com/testomatio/app/issues/1425">#1425</a>)</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and check all amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a>, and what other features we should do!</div><!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Java Integration, Telegram Reports, and Jira Without Admins Setup]]></title><description><![CDATA[New in Testomat.io: Java integration, Telegram notifications, smarter test management with AI, Jira setup without admin rights, and more.]]></description><link>https://changelog.testomat.io/java-integration-telegram-reports-and-jira-without-admins-setup/</link><guid isPermaLink="false">68bfcedf8de77704b7ef4046</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Thu, 02 Oct 2025 11:11:34 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/10/preview.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="java-support-added">Java Support Added</h2>
<img src="https://changelog.testomat.io/content/images/2025/10/preview.png" alt="Java Integration, Telegram Reports, and Jira Without Admins Setup"><p>We&#x2019;ve expanded Testomat.io with <strong>native Java support</strong> &#x1F389;</p>
<p>Previously, you could already use our <a href="https://github.com/testomatio/java-reporter">Java Reporter</a> and <a href="https://github.com/testomatio/java-check-tests">Java Check-Tests</a> CLI tool to import and report Java tests.</p>
<p>With this update, the setup process is even smoother:</p>
<ul>
<li>&#x1F680; Instructions for <strong>importing Java tests</strong> are now available directly in the <strong>application UI</strong>.</li>
<li>&#x1F4DD; Guidance for using the <strong>Java reporter</strong> is also included, so you can easily connect your automated Java tests with Testomat.io.</li>
</ul>
<p>This makes Java integration more straightforward, reducing the need to jump between documentation and your project.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/10/java-ui-1.png" alt="Java Integration, Telegram Reports, and Jira Without Admins Setup" loading="lazy"></p>
<h2 id="easier-jira-integration-without-admin-rights">Easier Jira Integration Without Admin Rights</h2>
<p>We&#x2019;ve simplified how you connect <strong>Jira with Testomat.io</strong>. Now, you can set up the integration <strong>without requiring Jira administrator rights</strong>.</p>
<p>This improvement makes it much easier for teams to get started, especially when:</p>
<ul>
<li>Users don&#x2019;t have admin permissions in Jira</li>
<li>Companies want to avoid lengthy setup approvals</li>
<li>Teams need a faster way to link test management with issue tracking</li>
</ul>
<p>With this change, more users can seamlessly integrate Jira and start benefiting from synchronized workflows between <strong>test cases, runs, and Jira issues</strong> &#x2014; all without waiting for admin access.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/09/jira-no-admin-rights.png" alt="Java Integration, Telegram Reports, and Jira Without Admins Setup" loading="lazy"></p>
<h2 id="run-report-notifications-via-telegram">Run Report Notifications via Telegram</h2>
<p>We&#x2019;ve expanded communication options in Testomat.io &#x2014; now you can send <strong>Run Report notifications directly to Telegram</strong>. This makes it simple to keep your team instantly updated on the status of test runs, without switching tools.</p>
<p>With this update, you can:</p>
<ul>
<li><strong>Customize your notifications</strong>: Use Run Report templates to adjust the content and formatting of messages.</li>
<li><strong>Share with any audience</strong>: Deliver notifications to <strong>Telegram chats, channels, or groups</strong>, depending on where your team collaborates.</li>
<li><strong>Stay aligned in real time</strong>: Ensure distributed teams, stakeholders, or external partners receive updates immediately, keeping everyone on the same page.</li>
</ul>
<p>This integration helps improve visibility and speeds up decision-making by bringing test results directly into the conversations where your team works every day.</p>
<p>Learn more <a href="https://docs.testomat.io/integrations/report-notifications/telegram/#_top">here</a>.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/09/telegram-notification.png" alt="Java Integration, Telegram Reports, and Jira Without Admins Setup" loading="lazy"></p>
<h2 id="filter-manual-and-automated-tests-in-mixed-runs">Filter Manual and Automated Tests in Mixed Runs</h2>
<p>Mixed Runs in Testomat.io allow you to execute <strong>manual and automated tests together within the same run</strong>, combining results into one unified report.</p>
<p>With this update, we&#x2019;ve added a <strong>filter option</strong> that lets you quickly separate <strong>manual</strong> from <strong>automated</strong> tests inside a Mixed Run.</p>
<p>This makes it easier to:</p>
<ul>
<li>Focus only on manual work when needed</li>
<li>Review automated results independently</li>
<li>Navigate large runs with better clarity and control</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/09/automated-manual-filters.png" alt="Java Integration, Telegram Reports, and Jira Without Admins Setup" loading="lazy"></p>
<h2 id="ai-suggested-tests-from-suite-description">AI-Suggested Tests from Suite Description</h2>
<p>We&#x2019;ve made it even easier to create tests with the help of AI. &#x2728;</p>
<p>Now, you can simply <strong>add a text description to any suite</strong>, and Testomat.io will generate tests for you directly from that description.</p>
<ul>
<li>No additional integrations or setup required.</li>
<li>Quickly turn suite documentation into actionable test cases.</li>
<li>Perfect for jumpstarting test creation or expanding coverage based on functional requirements.</li>
</ul>
<p>This improvement helps you save time and move from <strong>idea &#x2192; documentation &#x2192; tests</strong> in just a few clicks.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/10/tests-from-description.gif" alt="Java Integration, Telegram Reports, and Jira Without Admins Setup" loading="lazy"></p>
<h2 id="change-project-slug-project-id">Change Project Slug (Project ID)</h2>
<p>We&#x2019;ve added the ability to <strong>change a project slug (project ID)</strong> &#x2014; a feature that was not available before.</p>
<p>This option is accessible only to <strong>Project Owners</strong> from the <strong>Company Projects Dashboard</strong>. It gives more flexibility in managing project identifiers, especially if naming conventions or organizational structures change.</p>
<p>&#x26A0;&#xFE0F; <strong>Important Note:</strong><br>
Changing a Project ID will:</p>
<ul>
<li>Affect all existing project links</li>
<li>Potentially disconnect the project from Jira integrations</li>
</ul>
<p>&#x1F449; We recommend updating the Project ID <strong>only when absolutely necessary</strong>.</p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Introduced the ability to delete ongoing runs</li>
<li>Fixed a 400 error when creating a bug via the YouTrack and ClickUp integrations</li>
<li>Optimized uploading of stack traces in test run results</li>
<li>Improved linking tests by test plan in the Jira plugin</li>
<li>Fixed AI settings update</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and check all amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a>, and what other features we should do!</div><!--kg-card-end: markdown--><h2></h2>]]></content:encoded></item><item><title><![CDATA[Global Analytics Widgets, New Runs Storage & Reporting Updates]]></title><description><![CDATA[Try Testomat.io Agents now! 
New Analytics Widgets, Gauge integration, Run purge strategy, and PDF export for better test management.]]></description><link>https://changelog.testomat.io/global-analytics-widgets-new-runs-storage-reporting-updates/</link><guid isPermaLink="false">689d88939a097604a8531e85</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Wed, 03 Sep 2025 07:20:51 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/09/preview-0209.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="dashboard-widgets-for-global-analytics">Dashboard Widgets for Global Analytics</h2>
<img src="https://changelog.testomat.io/content/images/2025/09/preview-0209.png" alt="Global Analytics Widgets, New Runs Storage &amp; Reporting Updates"><p>We&#x2019;re introducing <strong>customizable widgets</strong> for Global Analytics, giving you the power to organize and visualize testing data in the way that best fits your team.</p>
<p>Getting started is simple: navigate to <strong>Dashboard &#x2192; Analytics &#x2192; Dashboards &#x2192; Create your own Dashboard</strong> and choose the widgets you&#x2019;d like to display. You can track key metrics such as <strong>Success Rate by Date, Run Stats, Automation coverage, and Execution trends</strong>.</p>
<p><strong>Why this is useful</strong></p>
<ul>
<li><strong>Personalized view</strong> &#x2013; focus on the metrics that matter most to your project.</li>
<li><strong>Better visibility</strong> &#x2013; instantly understand the health and progress of your testing activities.</li>
<li><strong>Data-driven decisions</strong> &#x2013; spot trends, identify bottlenecks, and track improvements over time.</li>
<li><strong>Team alignment</strong> &#x2013; share a unified view of test results across the organization.</li>
</ul>
<p>With customizable dashboards, your analytics experience becomes more flexible and actionable, helping you drive smarter QA and release decisions.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/09/widgets.gif" alt="Global Analytics Widgets, New Runs Storage &amp; Reporting Updates" loading="lazy"></p>
<h2 id="agents-temporarily-available-for-professional-plan">Agents Temporarily Available for Professional Plan</h2>
<p><strong>Testomat.io Agents</strong> are intelligent assistants that enhance your testing workflow by:</p>
<ul>
<li><strong>Deep Analyze</strong> - Deeply analyzes the project and provides a comprehensive overview of test run status and coverage</li>
<li><strong>Improve Test Descriptions</strong> - Goes through all tests with descriptions and improves their Markdown formatting. This action won&apos;t change the content, only the formatting. Applied changes can be safely reverted in Pulse.</li>
<li><strong>Write Descriptions from Code</strong> - Updates all automated tests that have no description by analyzing their code and writing descriptions.</li>
<li><strong>Mark Flaky Tests</strong> - Adds a &quot;Flaky&quot; label to all flaky tests detected by analytics. Update analytics settings to customize flaky detection.</li>
<li><strong>Mark Failed Tests</strong> - Adds a &quot;Failed&quot; label to all tests that have failed consistently; analytics detects tests that failed throughout the last month.</li>
<li><strong>Transform Project to BDD</strong> - Creates a new project with all the tests converted to BDD Gherkin format.</li>
</ul>
<p>These powerful features are normally included in the <strong>Enterprise plan</strong>. However, we&#x2019;re excited to announce that <strong>AI Agents are now temporarily available for Professional plan users</strong>.</p>
<p>This is the perfect opportunity to try them out in your projects and see how AI can optimize your QA process.</p>
<p>&#x1F449; Give them a spin and share your feedback with us in <a href="https://testomatio.slack.com/archives/C013RBWCXJA">our Slack community</a>.</p>
<h2 id="%F0%9F%9A%80-testomatio-reporter-230">&#x1F680; Testomat.io Reporter 2.3.0</h2>
<p>We&#x2019;re excited to introduce a <strong>new version of the Testomat.io Reporter</strong> packed with powerful enhancements that make your reporting more flexible and deeply integrated with your workflow.</p>
<h3 id="%F0%9F%94%96-add-labels-to-tests-via-reporting">&#x1F516; Add Labels to Tests via Reporting</h3>
<p>You can now <strong>add a label to a reported test</strong> directly through the Reporter.<br>
Unlike the <code>meta</code> label, these labels are <strong>persisted to the test case itself</strong>, not just to the reported run.</p>
<ul>
<li>If the label doesn&#x2019;t exist in Testomat.io, it will be automatically created and linked to the test case.</li>
<li>You can also use existing labels or pass a <strong>label value</strong> if the label was created as a custom field.<br>
&#x1F449; <a href="https://github.com/testomatio/reporter/blob/2.x/docs/functions.md#label">Learn more</a></li>
</ul>
<h3 id="%F0%9F%94%97-link-tests-to-a-run-result">&#x1F517; Link Tests to a Run Result</h3>
<p>It&#x2019;s now possible to <strong>link one or multiple tests to the current test in the report</strong>.<br>
This lets you associate multiple test cases with a single execution, ensuring more precise coverage tracking.<br>
&#x1F449; <a href="https://github.com/testomatio/reporter/blob/2.x/docs/functions.md#linktest">See docs</a></p>
<h3 id="%F0%9F%93%9D-link-run-results-to-jira-issues">&#x1F4DD; Link Run Results to Jira Issues</h3>
<p>With Reporter 2.3.0, you can now <strong>link Jira issue IDs to your test report</strong>, creating a direct connection between executed tests and related Jira issues. This tightens the feedback loop between QA and development.<br>
&#x1F449; <a href="https://github.com/testomatio/reporter/blob/2.x/docs/functions.md#linkjira">See docs</a></p>
<h2 id="%F0%9F%97%91%EF%B8%8F-new-runs-deletion-strategy">&#x1F5D1;&#xFE0F; New Runs Deletion Strategy</h2>
<p>Starting September 2025, Testomat.io replaces the old <strong>Delete</strong> option with a safer <strong>Purge</strong> for Runs workflow. Purged Runs are first compressed and moved to the Archive, preserving essential data like test results and artifacts. Runs can be restored from the Archive, and permanent deletion only happens when manually removed from the Archive. Automatic purges follow the same two-step process, keeping your workspace clean while safeguarding important data.</p>
<p>You can find more details in <a href="https://docs.testomat.io/project/runs/managing-runs/#purge-runs">our documentation</a>.</p>
<h2 id="%F0%9F%A4%96-improved-artifacts-support-for-webdriverio">&#x1F916; Improved Artifacts Support for WebdriverIO</h2>
<p>We&#x2019;ve fixed and improved artifacts support for <strong>WebdriverIO (WDIO)</strong>, a popular end-to-end testing framework for web applications. With this update, you can now generate and view artifacts more reliably when running automated tests.</p>
<p>To ensure accurate WDIO reporting, please use <strong>Testomat.io Reporter v2.2.2</strong>.</p>
<h2 id="gauge-framework-support">Gauge Framework Support</h2>
<p>We&#x2019;re excited to announce support for the <strong>Gauge</strong> test automation framework &#x2014; an open-source tool for writing <strong>lightweight, readable, and maintainable</strong> acceptance tests.</p>
<p>With this update, you can:</p>
<ul>
<li>&#x1F4E5; <strong>Import Gauge tests</strong> into Testomat.io and keep them synced</li>
<li>&#x1F511; <strong>Update test IDs</strong> so automated tests match correctly during CI/CD reporting</li>
<li>&#x1F6E0;&#xFE0F; Find all required <strong>commands directly in Testomat.io</strong> for easy setup</li>
<li>&#x1F4CA; <strong>Report Gauge test results</strong> with Testomat.io Reporter.</li>
</ul>
<p>This integration makes it seamless to manage Gauge projects in your existing Testomat.io workflows.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/08/gauge-reporting.png" alt="Global Analytics Widgets, New Runs Storage &amp; Reporting Updates" loading="lazy"></p>
<p><img src="https://changelog.testomat.io/content/images/2025/08/gauge-import.png" alt="Global Analytics Widgets, New Runs Storage &amp; Reporting Updates" loading="lazy"></p>
<h2 id="improved-invite-page">Improved Invite Page</h2>
<p>We&#x2019;ve made the <strong>Invite Page more user-friendly</strong>. Now, when inviting new team members to your company, you can <strong>assign a user role right away</strong>.</p>
<p>This small but impactful improvement streamlines the onboarding process, saves time, and ensures new users get the correct permissions from the start.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/09/invite-page.png" alt="Global Analytics Widgets, New Runs Storage &amp; Reporting Updates" loading="lazy"></p>
<h2 id="export-suite-as-pdf">Export Suite as PDF</h2>
<p>You can now <strong>download any suite as a PDF file</strong> and easily share it with teammates, managers, or stakeholders.</p>
<p>Using this feature is simple:</p>
<ol>
<li>Open the desired suite</li>
<li>Click the <strong>Extra menu button</strong></li>
<li>Select <strong>Export as PDF</strong></li>
</ol>
<p>This makes it convenient to share test content outside of Testomat.io, whether for reviews, documentation, or reporting purposes.</p>
<h2 id="export-insights-as-pdf">Export Insights as PDF</h2>
<p>We&#x2019;ve expanded the <strong>Insights</strong> feature with the ability to export reports as PDF files. Now, instead of keeping your analytics only within Testomat.io, you can easily download and share Insights with teammates, stakeholders, or management.</p>
<p>This makes it simple to:</p>
<ul>
<li>Distribute <strong>test health and coverage reports</strong> during sprint reviews or release planning.</li>
<li>Attach <strong>detailed insights</strong> to documentation or presentations.</li>
<li>Keep a <strong>snapshot of project quality</strong> for historical tracking or audits.</li>
</ul>
<p>With just one click, you can generate a professional, shareable PDF version of your Insights report.</p>
<h2 id="save-tests-as-templates">Save Tests as Templates</h2>
<p>We&#x2019;ve enhanced the <strong>Templates</strong> feature with the ability to save any existing test as a reusable template. When editing a test, you can now select <strong>Save as Template</strong>, and a new template will be created automatically.</p>
<p>This allows you to:</p>
<ul>
<li>Reuse well-structured tests as blueprints for similar scenarios.</li>
<li>Maintain consistency across large test repositories.</li>
<li>Speed up test creation by starting from proven test cases instead of building from scratch.</li>
</ul>
<p>With this update, templates become even more practical &#x2014; helping teams standardize test design and improve productivity.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/09/save-test-as-template.png" alt="Global Analytics Widgets, New Runs Storage &amp; Reporting Updates" loading="lazy"></p>
<h2 id="updated-ui-for-advanced-relaunch">Updated UI for Advanced Relaunch</h2>
<p>The <strong>Advanced Relaunch</strong> option allows teams to restart failed or selected tests from previous runs, ensuring faster recovery and more efficient reruns without starting the entire suite again.</p>
<p>We&#x2019;ve updated its UI to make it even more powerful:</p>
<ul>
<li>Added <strong>new sorting options</strong> to organize tests before relaunch.</li>
<li>Introduced <strong>advanced filtering</strong> to easily target tests in <strong>mixed runs</strong> (manual + automated).</li>
</ul>
<p>This improvement makes it easier to relaunch exactly what you need &#x2014; helping teams save time and keep test cycles lean and focused.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/09/Advanced-relaunch-UI.gif" alt="Global Analytics Widgets, New Runs Storage &amp; Reporting Updates" loading="lazy"></p>
<h2 id="add-text-to-completed-reports">Add Text to Completed Reports</h2>
<p>We&#x2019;ve introduced the option to add text to a report even after the run has been completed. This means you can provide additional context, document conclusions, or highlight key observations once execution is finished. The added text will appear in the summary section of the report, making it easier to keep your team and stakeholders aligned with the most up-to-date insights.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/08/Add-Text-to-Completed-Reports.gif" alt="Global Analytics Widgets, New Runs Storage &amp; Reporting Updates" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Fixed an issue where code templates were missing tag IDs, which prevented variables from being set correctly</li>
<li>Fixed incorrect symbols in a test code template</li>
<li>Fixed an issue that prevented creating issues with YouTrack integration</li>
<li>Fixed issues with test runs and manual test data loading, improving overall system reliability</li>
<li>Improved processing of Test Results on the Run Report page: optimized uploading of artifacts and traces</li>
<li>Added support for labels in Key:Value format in notification templates</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and explore all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a> and what other features we should build!</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Upgraded Manual Testing, Navigation and AI-Powered Insights]]></title><description><![CDATA[Star favorite suites for quick access, collapse/expand all in large runs, use AI in Testomat.io & Jira, merge tests, export to Markdown, and more updates.]]></description><link>https://changelog.testomat.io/upgraded-manual-testing-navigation-and-ai-insights/</link><guid isPermaLink="false">68879b38d5e17c4af353a634</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Fri, 08 Aug 2025 10:01:02 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/08/preview.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="mark-suites-as-favourites-%E2%AD%90">Mark Suites as Favourites &#x2B50;</h2>
<img src="https://changelog.testomat.io/content/images/2025/08/preview.png" alt="Upgraded Manual Testing, Navigation and AI-Powered Insights"><p>Working with large projects often means navigating through a long test tree or running the same search queries over and over again. To make your workflow smoother, we&#x2019;ve introduced the <strong>Starred</strong> feature for test suites.</p>
<p>Now you can:</p>
<ul>
<li>&#x2B50; <strong>Star any suite</strong> you plan to work on &#x2014; whether it&#x2019;s for the current sprint, a specific release, or ongoing maintenance.</li>
<li>&#x1F4C2; <strong>Create your own quick-access list</strong> of suites to eliminate repetitive navigation through the entire test tree.</li>
<li>&#x1F50D; <strong>Mark favourites directly from search results</strong> &#x2014; if you&#x2019;re looking for a test or suite by keyword or using filters, you can instantly star it without leaving the search view.</li>
<li>&#x23F1; <strong>Quickly return to starred suites</strong> even after filters or selections are reset &#x2014; simply use the Starred filter to jump back.</li>
<li>&#x1F6E0; <strong>Work more efficiently</strong> by keeping frequently accessed suites at your fingertips, especially in large or multi-team projects.</li>
</ul>
<p>Whether you&#x2019;re focusing on smoke tests, regression packs, or a specific customer scenario, the <strong>Starred</strong> feature ensures you spend less time navigating and more time testing.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/08/starred.gif" alt="Upgraded Manual Testing, Navigation and AI-Powered Insights" loading="lazy"></p>
<h2 id="export-version-control-manual-tests-in-git-markdown-support">Export &amp; Version Control Manual Tests in Git (Markdown Support)</h2>
<p>You can now <strong>export your manual tests as Markdown files directly into your Git repository</strong> and keep them under version control.<br>
This allows your team to <strong>write, edit, and track changes to manual test cases just like code</strong> &#x2014; with full history, branching, and collaboration benefits that Git offers.</p>
<p>What&#x2019;s new:</p>
<ul>
<li>&#x1F4E4; <strong>Export manual tests to Markdown</strong> &#x2013; keep human-readable test cases in your repository.</li>
<li>&#x1F4E5; <strong>Import manual tests via CLI</strong> &#x2013; instantly sync Markdown changes from Git into Testomat.io.</li>
<li>&#x1F504; <strong>Version control for manual tests</strong> &#x2013; review, revert, and track changes over time.</li>
<li>&#x270D;&#xFE0F; <strong>Write and modify in your editor</strong> &#x2013; use your favorite IDE to create or update test cases.</li>
</ul>
<p><strong>User Scenario:</strong><br>
Imagine you&#x2019;re working on a <strong>new product release</strong> and your QA team collaborates closely with developers.<br>
Instead of switching back and forth between Testomat.io and your code editor, the <strong>manual test cases</strong> live right next to your application code in Git.</p>
<ul>
<li>A developer spots a missing step in a test case and updates the Markdown file in their branch.</li>
<li>The change is committed, reviewed in a pull request, and merged into <code>main</code>.</li>
<li>The updated test case is automatically synced back into Testomat.io via CLI.</li>
</ul>
<p>This workflow makes <strong>test case management seamless, collaborative, and fully traceable</strong> &#x2014; aligning your manual testing process with modern DevOps practices.</p>
<p><strong>How to Export Tests?</strong></p>
<p>Run the export command in your project directory to download tests from Testomat.io.<br>
Tests will be created in <strong>Markdown format</strong>.<br>
It&#x2019;s <strong>highly recommended</strong> to keep the exported test files under Git version control.</p>
<p><strong>What Gets Exported?</strong></p>
<ul>
<li>All test suites and tests from your current project.</li>
<li>Test IDs, descriptions, priorities, and structure are preserved in the generated files.</li>
</ul>
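<p>The workflow above can be sketched as a shell session. Note this is a hypothetical sketch: the exact export command and flags live in the Testomat.io docs, and only the <code>TESTOMATIO</code> variable (the standard project API key used by Testomat.io CLI tools) is assumed here:</p>

```shell
# Hypothetical sketch: export manual tests as Markdown, then track them in Git.
# The actual export command is documented by Testomat.io; the invocation
# below is a placeholder, not the confirmed CLI syntax.
export TESTOMATIO=tstmt_your_project_api_key

# (hypothetical command) download manual tests as .md files
npx check-tests export

# keep the exported files under version control, as recommended above
git add .
git commit -m "Sync manual test cases from Testomat.io"
```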
<p><img src="https://changelog.testomat.io/content/images/2025/08/export-to-md.gif" alt="Upgraded Manual Testing, Navigation and AI-Powered Insights" loading="lazy"></p>
<h2 id="deep-analyze-agent-%E2%80%93-ai-powered-project-insights">Deep Analyze Agent &#x2013; AI-Powered Project Insights</h2>
<p>Introducing the <strong>Deep Analyze Agent</strong>, an advanced AI tool designed to give you a <strong>comprehensive, data-driven overview</strong> of your project&#x2019;s testing health.<br>
It goes beyond surface-level reporting to deeply assess <strong>running status, test coverage, and release readiness</strong>.</p>
<p>It provides:</p>
<ul>
<li>&#x1F50D; <strong>Full project scan</strong> &#x2013; AI examines the structure, test suites, and execution history.</li>
<li>&#x1F5C2; <strong>Functional area mapping</strong> &#x2013; identifies project areas based on suites and their test coverage.</li>
<li>&#x1F4CA; <strong>Coverage analysis</strong> &#x2013; compares existing tests with execution results to highlight untested or risky areas.</li>
<li>&#x1F4A1; <strong>Actionable insights</strong> &#x2013; provides recommendations for improving coverage, stability, and release readiness.</li>
<li>&#x1F680; <strong>Release readiness check</strong> &#x2013; helps teams make informed go/no-go decisions before deployment.</li>
</ul>
<p><strong>User Scenario:</strong><br>
You&#x2019;re preparing for a <strong>major release</strong> and need a clear picture of your project&#x2019;s testing health.<br>
Instead of manually compiling reports from multiple runs, the <strong>Deep Analyze Agent</strong> automatically:</p>
<ol>
<li>Scans your entire project structure and suite organization.</li>
<li>Maps suites to functional areas of your product.</li>
<li>Compares planned coverage with executed coverage.</li>
<li>Flags untested, low-pass-rate, or high-risk areas.</li>
<li>Summarizes the results into an <strong>easy-to-read dashboard with recommendations</strong>.</li>
</ol>
<p>This means QA leads, PMs, and developers can make <strong>data-backed decisions</strong> faster, with confidence in the quality status of the product.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/08/deep-analyze.png" alt="Upgraded Manual Testing, Navigation and AI-Powered Insights" loading="lazy"></p>
<h2 id="analyze-suite-%E2%80%93-functional-coverage-stability-at-a-glance">Analyze Suite &#x2013; Functional Coverage &amp; Stability at a Glance</h2>
<p>The new <strong>Analyze Suite</strong> tool brings AI-powered analytics directly to individual suites, helping you assess both <strong>functional coverage</strong> and <strong>suite stability</strong> without navigating the entire project view.</p>
<p>What&#x2019;s new:</p>
<ul>
<li>&#x1F5C2; <strong>Functional area coverage mapping</strong> &#x2013; analyzes tests within a suite to determine which parts of your product it covers.</li>
<li>&#x1F4CA; <strong>Suite Stability Report</strong> &#x2013; evaluates recent test execution results to highlight flakiness, instability, or recurring issues.</li>
<li>&#x1F50D; <strong>Focused insight</strong> &#x2013; ideal for monitoring the health of specific product modules or critical flows.</li>
</ul>
<p>By providing actionable insights at the suite level, teams can quickly identify improvement areas, address instability, and maintain high-quality standards in critical parts of their projects.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/08/analyze-suite.png" alt="Upgraded Manual Testing, Navigation and AI-Powered Insights" loading="lazy"></p>
<h2 id="ai-features-in-the-jira-plugin">AI Features in the Jira Plugin</h2>
<p>We&#x2019;ve extended Testomat.io&#x2019;s AI capabilities to the <strong>Jira plugin</strong>! Now you can access powerful AI-assisted options directly within your Jira projects, including:</p>
<ul>
<li><strong>Suggest Tests</strong> &#x2014; Automatically generate relevant test cases.</li>
<li><strong>Suggest Description</strong> &#x2014; Get AI-powered recommendations for detailed and clear issue descriptions.</li>
<li>Other AI-driven enhancements to streamline your test design and issue tracking workflow.</li>
</ul>
<p>This integration helps improve collaboration between development and QA teams by embedding intelligent test suggestions right where your work happens.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/08/AI-for-Jira-Plugin.gif" alt="Upgraded Manual Testing, Navigation and AI-Powered Insights" loading="lazy"></p>
<h2 id="merge-tests-%E2%80%93-simplify-and-clean-your-test-repository">Merge Tests &#x2013; Simplify and Clean Your Test Repository</h2>
<p>Sometimes test repositories accumulate duplicate or overlapping tests that can cause confusion and inefficiency. The new <strong>Merge Tests</strong> feature helps you clean up your test library by combining multiple tests into one while preserving valuable information from all merged tests.</p>
<p><strong>Key benefits:</strong></p>
<ul>
<li>&#x1F9F9; Easily remove duplicate or redundant tests</li>
<li>&#x1F4DD; Consolidate descriptions and important details from all merged tests</li>
<li>&#x1F504; Maintain test history and continuity without losing data</li>
</ul>
<p><strong>How it works:</strong></p>
<ul>
<li>On the <strong>Tests page</strong>, enable <strong>multiselection</strong></li>
<li>Select the tests you want to merge</li>
<li>Click the <strong>extra button</strong> on the bottom menu bar and choose <strong>Merge</strong></li>
<li>Select the <strong>main test</strong> to keep as the base for merging</li>
<li>Review and edit the <strong>suggested combined description</strong></li>
<li>Add any follow-up notes if needed</li>
<li>Confirm by clicking <strong>Update test description</strong></li>
</ul>
<p>This streamlined process makes test maintenance easier, helping teams keep their test repository clean, accurate, and up-to-date.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/08/merge-tests.gif" alt="Upgraded Manual Testing, Navigation and AI-Powered Insights" loading="lazy"></p>
<h2 id="collapse-and-expand-all-suites-in-runs">Collapse and Expand All Suites in Runs</h2>
<p>Managing large test runs with hundreds or thousands of tests can be overwhelming. To make navigation easier, we introduced the ability to <strong>collapse and expand all suites</strong> in the test list.</p>
<p>This feature allows you to quickly hide or show all test suites within a run, helping you focus on specific parts of your test set or get an overview without endless scrolling.</p>
<p>Perfect for runs with 1000+ tests, it saves time and reduces visual clutter, making manual test execution smoother and more efficient.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/08/collapse-all.gif" alt="Upgraded Manual Testing, Navigation and AI-Powered Insights" loading="lazy"></p>
<h2 id="nightwatch-framework-support">Nightwatch Framework Support</h2>
<p>We&#x2019;ve added <strong>Nightwatch</strong> framework support to Testomat.io! Now you can easily import your Nightwatch automated tests and enable detailed reporting within the platform.</p>
<p>Use the latest version of the <code>check-tests</code> and <code>@testomatio/reporter</code> CLI tools to import and run your Nightwatch tests with full integration. This allows you to track test results, analyze trends, and generate reports directly from your Nightwatch test runs.</p>
<p>For detailed instructions on importing and configuring Nightwatch tests, see the <a href="https://docs.testomat.io/project/import-export/import/import-js/">dedicated documentation</a>.</p>
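<p>A minimal import sketch with the <code>check-tests</code> CLI (the <code>nightwatch</code> framework identifier and the glob pattern are assumptions; see the linked docs for the exact invocation):</p>

```shell
# Sketch: import Nightwatch tests into Testomat.io via check-tests.
# TESTOMATIO holds the project API key; the "nightwatch" framework id
# and the file glob are assumptions - consult the linked documentation.
TESTOMATIO=tstmt_your_project_api_key \
  npx check-tests@latest nightwatch "tests/**/*.js"
```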
<p><img src="https://changelog.testomat.io/content/images/2025/08/Nightwatch.png" alt="Upgraded Manual Testing, Navigation and AI-Powered Insights" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Fixed RunGroup pagination <a href="https://github.com/testomatio/app/issues/1386">https://github.com/testomatio/app/issues/1386</a></li>
<li>Added Launch a Copy option for Mixed runs <a href="https://github.com/testomatio/app/issues/1389">https://github.com/testomatio/app/issues/1389</a></li>
<li>Added a Save option for Mixed runs <a href="https://github.com/testomatio/app/issues/1388">https://github.com/testomatio/app/issues/1388</a></li>
<li>Fixed test order in PDF report not matching the test order in the run report <a href="https://github.com/testomatio/app/issues/1383">https://github.com/testomatio/app/issues/1383</a></li>
<li>Improved processing of multiselection bulk actions for Manual Run</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and explore all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a> and what other features we should build!</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Improved Mixed Runs, Notification Customization, and Enhanced Analytics Experience]]></title><description><![CDATA[New Testomat.io updates: launch mixed runs without CI, improved templates, customizable Slack/Teams notifications, faster Global Analytics with filtering, and better Jira field handling.]]></description><link>https://changelog.testomat.io/improved-mixed-runs-notification-customization-and-enhanced-analytics-experience/</link><guid isPermaLink="false">686b7c455e430844a65992a9</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Tue, 29 Jul 2025 04:43:31 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/07/preview.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="global-analytics-revamped">Global Analytics Revamped</h2>
<img src="https://changelog.testomat.io/content/images/2025/07/preview.png" alt="Improved Mixed Runs, Notification Customization, and Enhanced Analytics Experience"><p>The Global Analytics page just got a major upgrade!</p>
<ul>
<li>&#x1F680; <strong>Faster loading &amp; improved performance</strong> across large data sets</li>
<li>&#x1F9E0; <strong>Smarter UI</strong> with redesigned layout for better readability</li>
<li>&#x1F3AF; <strong>Simplified filtering</strong> by project, tags, status, and more</li>
<li>&#x1F4C8; <strong>New pie-chart diagrams</strong> to instantly visualize test coverage and outcomes</li>
</ul>
<p>Now it&apos;s easier than ever to get a quick view of testing across your entire organization.</p>
<h3 id="global-analytics-now-with-label-custom-field-filters">Global Analytics: Now with Label &amp; Custom Field Filters</h3>
<p>We&#x2019;ve expanded the filtering options in <strong>Global Analytics</strong>:</p>
<ul>
<li>&#x2705; <strong>Filter by labels</strong> to narrow down results by tag (e.g., <code>smoke</code>, <code>regression</code>, <code>critical</code>)</li>
<li>&#x1F6E0;&#xFE0F; <strong>Filter by custom fields</strong> defined in your workspace or test suite</li>
</ul>
<p>These additions make cross-project analysis even more precise and tailored to your team&#x2019;s needs.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/07/label-filter.png" alt="Improved Mixed Runs, Notification Customization, and Enhanced Analytics Experience" loading="lazy"></p>
<h2 id="mixed-runs-without-ci-configuration">Mixed Runs Without CI Configuration</h2>
<p>You can now launch <strong>mixed test runs</strong> (manual + automated) <strong>even if your project isn&#x2019;t connected to a CI system</strong>. Previously, this option was only available for projects with a configured CI pipeline.</p>
<p>With this update, you can:</p>
<ul>
<li>Start a mixed test run that includes both <strong>manual and automated tests</strong>.</li>
<li>Send automated test results <strong>via API or the Testomat.io Reporter</strong>.</li>
<li><strong>Keep full control</strong> of test execution in projects that are managed outside CI/CD tools.</li>
</ul>
<p>&#x1F517; For reporting with <strong>Testomat.io Reporter</strong>, see <a href="https://docs.testomat.io/project/runs/reporter">the documentation</a></p>
<p>&#x1F517; For reporting via <strong>API</strong>, <a href="https://testomatio.github.io/reporter/#/paths/~1api~1reporter~1%7BrunId%7D/put">see the reference</a></p>
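<p>As an illustration of the API route referenced above, a hedged <code>curl</code> sketch (the JSON payload fields are assumptions; the linked reference has the authoritative schema):</p>

```shell
# Sketch: update/finish a run via PUT /api/reporter/{runId}.
# Replace {runId} with the id of your launched run; the JSON fields
# shown here are assumptions - see the linked API reference.
curl -X PUT "https://app.testomat.io/api/reporter/{runId}" \
  -H "Content-Type: application/json" \
  -d '{"api_key": "tstmt_your_project_api_key", "status_event": "finish"}'
```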
<p>This gives teams greater flexibility when working in hybrid environments or when CI integration isn&#x2019;t yet set up.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/07/mixed-no-ci.png" alt="Improved Mixed Runs, Notification Customization, and Enhanced Analytics Experience" loading="lazy"></p>
<h2 id="improved-template-creation-experience">Improved Template Creation Experience</h2>
<p>We&#x2019;ve enhanced the <strong>template creation interface</strong> to make it more intuitive and efficient. The updated UI offers a clearer layout, helping you build and manage templates with ease.</p>
<p>What&#x2019;s new:</p>
<ul>
<li>&#x2705; <strong>Streamlined UI</strong> &#x2013; Clean and simple layout for faster editing</li>
<li>&#x1F524; <strong>More dynamic variables</strong> &#x2013; Easily insert parameters and placeholders to customize content for different test cases or runs</li>
<li>&#x1F50D; <strong>Searchable variables list</strong> &#x2013; Quickly find the variables you need without scrolling</li>
<li>&#x1F9E9; <strong>Better flexibility</strong> &#x2013; Create reusable templates tailored to various teams, workflows, or projects</li>
</ul>
<p>These improvements help diversify your test documentation and boost consistency across test cases, reports, and communication.</p>
<p>See more variables in <a href="https://docs.testomat.io/management/project/templates/#using-variables-in-templates">the documentation</a>.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/07/templates.png" alt="Improved Mixed Runs, Notification Customization, and Enhanced Analytics Experience" loading="lazy"></p>
<h2 id="custom-templates-for-slack-and-ms-teams-notifications">Custom Templates for Slack and MS Teams Notifications</h2>
<p>You can now <strong>customize Slack and Microsoft Teams notifications</strong> using powerful templates. Whether you&apos;re alerting teams about test results, failures, or run completions &#x2014; tailor the message content to your needs with the expanded set of dynamic variables.</p>
<p>Key Highlights:</p>
<ul>
<li>&#x270F;&#xFE0F; <strong>Fully customizable content</strong> &#x2013; Adjust tone, structure, and data shown in notifications</li>
<li>&#x1F501; <strong>Use rich variables</strong> &#x2013; Access test names, run IDs, statuses, links, durations, and more</li>
<li>&#x1F50D; <strong>Searchable variables list</strong> &#x2013; Easily find and insert relevant variables while editing</li>
<li>&#x1F4E2; <strong>Improve clarity and relevance</strong> &#x2013; Ensure your team gets the right info at the right time, in the right format</li>
</ul>
<p>&#x26A0;&#xFE0F; <strong>Note:</strong> If you already have Slack or MS Teams notifications configured, you&#x2019;ll need to <a href="https://docs.testomat.io/integrations/report-notifications/rules/">create a <strong>new Notification Rule</strong> to apply the updated templates.</a></p>
<h2 id="improved-handling-of-jira-fields-versions-components">Improved Handling of Jira Fields: Versions &amp; Components</h2>
<p>We&#x2019;ve enhanced support for <strong>Jira fields</strong>, specifically:</p>
<ul>
<li>&#x1F504; <strong>Versions</strong> &#x2013; improved mapping and synchronization of release versions</li>
<li>&#x1F9F1; <strong>Components</strong> &#x2013; better detection and association with test artifacts</li>
</ul>
<p>These fields are <strong>commonly used during defect creation</strong>, so this improvement ensures more accurate and seamless linking between your Jira issues and test cases &#x2014; making your defect reporting process more efficient and reliable.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/07/comp-ver.png" alt="Improved Mixed Runs, Notification Customization, and Enhanced Analytics Experience" loading="lazy"></p>
<h2 id="rungroup-statistic-report">RunGroup Statistics Report</h2>
<p>Introducing the <strong>RunGroup Statistics Report</strong> &#x2014; a new way to analyze the health and progress of test runs grouped together.</p>
<p>This report includes:</p>
<ul>
<li>&#x2705; <strong>Run Execution Summary</strong> &#x2013; a quick breakdown of passed, failed, and skipped tests across all runs in the group</li>
<li>&#x1F50D; <strong>Detailed Analytics by Run Status</strong> &#x2013; view trends, patterns, and key metrics within each run</li>
<li>&#x1F4A1; <strong>AI-Powered Recommendations</strong> &#x2013; suggested actions to improve stability and address recurring issues</li>
</ul>
<p>Perfect for teams managing large-scale test executions across multiple environments or test types.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/07/run-group-stat.gif" alt="Improved Mixed Runs, Notification Customization, and Enhanced Analytics Experience" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Fixed follow-up messages for AI features</li>
<li>Implemented a combined Report that includes test runs from child folders<br>
<a href="https://github.com/testomatio/app/issues/1192">https://github.com/testomatio/app/issues/1192</a></li>
<li>Fixed suite search inconsistent behavior for newly created suites <a href="https://github.com/testomatio/app/issues/1303">https://github.com/testomatio/app/issues/1303</a></li>
<li>Fixed Global analytics filtering by Jira issue <a href="https://github.com/testomatio/app/issues/1377">https://github.com/testomatio/app/issues/1377</a></li>
<li>Improved handling of Artifacts for MinIO: added an "Enable if you use MinIO" setting (FORCE_PATH_STYLE=true) to the Artifacts configuration UI</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and explore all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a> and what other features we should build!</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[From Labels to Reports: Powerful New Tools for Scalable Test Automation]]></title><description><![CDATA[Streamline your QA workflow with AI-powered run reports, custom views, dynamic labeling, and enhanced test Testomat.io Reporter]]></description><link>https://changelog.testomat.io/from-labels-to-reports-powerful-new-tools-for-scalable-test-automation/</link><guid isPermaLink="false">681b57e6ce11371d6bfbefc9</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Tue, 08 Jul 2025 05:20:25 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/07/CleanShot-2025-07-07-at-22.09.14@2x.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="testomatio-reporter-v201-released">Testomat.io Reporter v2.0.1 Released</h2>
<img src="https://changelog.testomat.io/content/images/2025/07/CleanShot-2025-07-07-at-22.09.14@2x.png" alt="From Labels to Reports: Powerful New Tools for Scalable Test Automation"><p>We&#x2019;ve rolled out a major update to the <strong>Testomat.io Reporter</strong> library &#x2014; the core utility that sends your automated test results to the Testomat.io platform. This update improves compatibility, performance, and control for engineering teams integrating our reporter into their CI/CD pipelines.</p>
<h3 id="what%E2%80%99s-new">What&#x2019;s New:</h3>
<ul>
<li>
<p><strong>Full ESM Support (with CommonJS fallback)</strong><br>
Works seamlessly with modern JavaScript projects and still supports older setups.</p>
</li>
<li>
<p><strong>Simplified Imports</strong><br>
Cleaner, shorter import paths to make configuration easier.</p>
</li>
<li>
<p><strong>New <code>replay</code> Command</strong><br>
Re-send reports for a test run using a simple command &#x2014; helpful for debugging or audit trails. <em>(Run must be executed with DEBUG pipe enabled.)</em></p>
</li>
<li>
<p><strong>Control File Creation in XML Mode</strong><br>
Use <code>TESTOMATIO_IGNORE_NEW_TESTS</code> to prevent new files from being created during XML imports.</p>
</li>
<li>
<p><strong>New Path Config Options</strong><br>
Specify the root location for test files using <code>TESTOMATIO_SUITE</code> and <code>TESTOMATIO_WORKDIR</code> &#x2014; more flexibility for monorepos and custom setups.</p>
</li>
<li>
<p><strong>XML Import Enhancements</strong><br>
Improved accuracy and compatibility for teams working with JUnit or other XML formats.</p>
</li>
<li>
<p><strong>Faster Network Layer</strong><br>
Switched from <code>axios</code> to the lighter <code>gaxios</code> (built on <code>node-fetch</code>) for leaner HTTP performance.</p>
</li>
<li>
<p><strong>Improved Report Metadata</strong><br>
Added a timestamp column and unique <code>runId</code> for better tracking and traceability in reports.</p>
</li>
</ul>
<p>&#x1F449; For full technical details, visit the release notes: <a href="https://github.com/testomatio/reporter/releases/tag/2.0.1">Testomat.io Reporter v2.0.1 on GitHub</a></p>
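<p>The path and import controls above can be combined in a single invocation. A sketch, assuming the reporter&#x2019;s XML import command (<code>report-xml</code>) and placeholder values:</p>

```shell
# Sketch: JUnit XML import using the new controls from this release.
# TESTOMATIO_IGNORE_NEW_TESTS and TESTOMATIO_WORKDIR come from the notes
# above; the report-xml command name and the glob are assumptions - see
# the release notes for exact usage.
TESTOMATIO=tstmt_your_project_api_key \
TESTOMATIO_IGNORE_NEW_TESTS=1 \
TESTOMATIO_WORKDIR=packages/web \
  npx report-xml "report/**/junit*.xml"
```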
<h2 id="ai-powered-project-runs-status-report">AI-Powered Project Runs Status Report</h2>
<p>We&#x2019;ve added a smart new feature that automatically generates a <strong>high-level status report</strong> for your project&#x2019;s test runs &#x2014; powered by AI.</p>
<p>The <strong>Runs Status Report</strong> gives you a quick overview of test stability, critical issues, and performance trends across recent runs. It helps QA teams and stakeholders understand what&#x2019;s working well and where attention is needed &#x2014; without digging through individual test logs.</p>
<p>What&#x2019;s included:</p>
<ul>
<li><strong>Summary Overview</strong> &#x2013; Total test runs, overall pass rate, trends, and key action items.</li>
<li><strong>Area-Specific Stability</strong> &#x2013; Performance insights grouped by feature areas (e.g. subscriptions, user roles, etc.).</li>
<li><strong>Flaky &amp; Failed Tests</strong> &#x2013; Highlights of recurring issues or flaky behavior with potential risk.</li>
<li><strong>Execution Time Trends</strong> &#x2013; How test durations are behaving over time.</li>
<li><strong>Top Errors</strong> &#x2013; Most frequent failure messages to help speed up debugging.</li>
<li><strong>Systematic Failures</strong> &#x2013; Pinpointed test cases that failed consistently and may block critical flows.</li>
</ul>
<p>This report is available automatically based on recent test run history, giving your team <strong>instant visibility</strong> into the health of your project.</p>
<h2 id="apply-labels-to-imported-tests">Apply Labels to Imported Tests</h2>
<p>Now you can <strong>automatically organize your tests</strong> by adding labels to them at the moment they are imported into Testomat.io. Labels act like <strong>tags or categories</strong> that help you group and filter your tests more easily &#x2014; for example, by test type, area of functionality, or team ownership.</p>
<p>What this means in plain language:</p>
<p>When a developer imports automated tests into Testomat.io, they can now <strong>attach labels in bulk</strong>. This saves time and ensures tests are properly organized from the beginning &#x2014; no need to label them one by one later.</p>
<p>Example Use Cases:</p>
<ul>
<li>Group all tests for a login feature with <code>feature:auth</code></li>
<li>Mark high-priority tests with <code>severity:high</code></li>
<li>Separate backend and frontend tests with <code>team:backend</code>, <code>team:frontend</code></li>
<li>Identify core tests for quick runs with <code>smoke</code></li>
</ul>
<p>Once imported, you can <strong>easily filter and find tests</strong> in the UI using these labels &#x2014; improving test planning, execution, and reporting.</p>
<p>The feature is available in <code>check-tests@0.11.1</code> and <code>check-cucumber@0.6.0</code>.<br>
Find more details for <a href="https://github.com/testomatio/check-tests/blob/master/README.md#apply-labels-to-tests">check-tests</a> and <a href="https://github.com/testomatio/check-cucumber/blob/master/README.md#apply-labels-to-tests">check-cucumber</a></p>
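<p>A hedged sketch of labeling on import (the <code>--labels</code> flag is an assumption based on the linked README sections; the framework id and glob are placeholders for your project):</p>

```shell
# Sketch: attach labels in bulk while importing tests with check-tests.
# The --labels flag, the "playwright" framework id, and the glob are all
# assumptions - see the linked check-tests / check-cucumber READMEs.
TESTOMATIO=tstmt_your_project_api_key \
  npx check-tests@latest playwright "tests/**/*.spec.js" \
  --labels "smoke,severity:high,team:backend"
```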
<h2 id="custom-view-for-run-reports">Custom View for Run Reports</h2>
<p>You can now <strong>customize the table view inside any Run Report</strong>, just like on the <strong>Runs</strong> and <strong>Run Groups</strong> pages.</p>
<p>This allows you to:</p>
<ul>
<li>Show or hide columns such as test status, duration, labels, tags, priority, etc.</li>
<li>Focus only on the <strong>data that matters most</strong> to you or your team</li>
<li>Tailor the report layout for different review needs</li>
</ul>
<p>How to use:</p>
<ul>
<li>Click the <strong>&#x201C;Custom view&#x201D;</strong> button to instantly open the configurable view</li>
<li>Adjust <strong>Run list settings</strong>, including the <strong>width of individual columns</strong> to suit your preferences</li>
<li>These settings are <strong>saved automatically</strong> and will apply to all Run Reports until changed again</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/07/custom-view-report.gif" alt="From Labels to Reports: Powerful New Tools for Scalable Test Automation" loading="lazy"></p>
<h2 id="add-tests-to-ongoing-run-from-the-tests-page">Add Tests to Ongoing Run from the Tests Page</h2>
<p>You can now <strong>add tests directly to an ongoing test run</strong> right from the <strong>Tests page</strong> &#x2014; making test management more dynamic and efficient.</p>
<h3 id="%F0%9F%94%8D-how-it-works">&#x1F50D; How It Works:</h3>
<ol>
<li>Go to the <strong>Tests</strong> page.</li>
<li>Enable <strong>multiselection</strong>.</li>
<li>Use filters or checkboxes to select individual tests or entire suites.</li>
<li>In the bottom bar, click the <strong>Extra</strong> (&#x2022;&#x2022;&#x2022;) button.</li>
<li>Select <strong>Add to Run</strong>.</li>
<li>Choose the ongoing run from the list or use the search bar.</li>
<li>Confirm your selection &#x2014; and you&apos;re done!</li>
</ol>
<p>&#x2705; The selected tests will be immediately added to the active test run, helping you keep your execution plan up-to-date even while a run is already in progress.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/07/add-to-run.gif" alt="From Labels to Reports: Powerful New Tools for Scalable Test Automation" loading="lazy"></p>
<h2 id="simplified-top-bar-on-the-runs-page">Simplified Top Bar on the Runs Page</h2>
<p>We&apos;ve cleaned up the <strong>top bar on the Runs page</strong> to make the interface more streamlined and focused.<br>
But don&#x2019;t worry &#x2014; you still have access to all key actions!</p>
<p><strong>What changed:</strong></p>
<ul>
<li>The buttons for <strong>New Manual Run</strong>, <strong>Run Group</strong>, and <strong>CI Launch</strong> are now grouped under a single <strong>&#x201C;Action&#x201D;</strong> button in the top-right corner.</li>
<li>This change reduces visual clutter while keeping all important actions just one click away.</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/07/CleanShot-2025-07-07-at-21.29.21@2x.png" alt="From Labels to Reports: Powerful New Tools for Scalable Test Automation" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Improved uploading of multiple files, preventing file loss <a href="https://github.com/testomatio/app/issues/1215">https://github.com/testomatio/app/issues/1215</a></li>
<li>Improved test search: results now include matches in test descriptions, for both classical and BDD projects</li>
<li>Fixed issue with loading artifacts in Run Report Custom view <a href="https://github.com/testomatio/app/issues/1375">https://github.com/testomatio/app/issues/1375</a></li>
<li>Fixed an issue with editing a test title via the double-click option</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and check out all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a>, and what other features we should build!</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Clean Test Tree, Flexible Runs & Parameterized Tests Made Easy]]></title><description><![CDATA[Work smarter: render dynamic parameters in steps, clean up test tree, track changes in Test Runs]]></description><link>https://changelog.testomat.io/clean-test-tree-flexible-runs-parameterized-tests-made-easy/</link><guid isPermaLink="false">685e77295e430844a65991f0</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Sat, 28 Jun 2025 20:47:32 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/06/preview.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="simplified-test-tree-view-hide-tests-in-tree">Simplified Test Tree View: Hide Tests in Tree</h2>
<img src="https://changelog.testomat.io/content/images/2025/06/preview.png" alt="Clean Test Tree, Flexible Runs &amp; Parameterized Tests Made Easy"><p>For large projects, the test tree can get overwhelming. We&#x2019;ve added a <strong>&quot;Hide tests in tree&quot;</strong> option on the <strong>Tests</strong> page to help you keep things tidy. When enabled, expanding folders or suites will only show their structure &#x2014; <strong>without displaying individual tests</strong>.</p>
<p>This improves navigation clarity and speeds up browsing for users managing extensive test repositories.</p>
<p>To enable this mode:<br>
<strong>Go to the Tests page &#x2192; click the Display button &#x2192; enable &quot;Hide tests in tree&quot;</strong></p>
<p><strong>Benefits:</strong></p>
<ul>
<li>Quickly navigate large or deeply nested test structures</li>
<li>Reduce visual clutter when focusing on suites and folders</li>
<li>Improve performance and responsiveness of the tree view</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/06/hide-tests-in-a-tree.png" alt="Clean Test Tree, Flexible Runs &amp; Parameterized Tests Made Easy" loading="lazy"></p>
<h2 id="dynamic-parameter-rendering-in-test-steps">Dynamic Parameter Rendering in Test Steps</h2>
<p>You can now reference <strong>parameter values directly in test steps and expected results</strong> of parameterized test cases using <code>{{variable}}</code> syntax. This allows test steps to dynamically reflect the input data for each test iteration &#x2014; improving maintainability and clarity.</p>
<p><strong>Example:</strong></p>
<ul>
<li><strong>Step:</strong> Enter username <code>{{username}}</code> and password <code>{{password}}</code></li>
<li><strong>Expected result:</strong> User <code>{{username}}</code> is logged in successfully</li>
</ul>
<p><strong>Use cases:</strong></p>
<ul>
<li>Reusable data-driven test cases for login, forms, or bulk workflows</li>
<li>Reduced duplication by avoiding hardcoded values</li>
<li>Easier review and execution with clear step context per data set</li>
</ul>
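<p>Conceptually, this behaves like plain template substitution: each <code>{{variable}}</code> placeholder is replaced by the value from the current data row. A minimal Python sketch of the idea (illustrative only, not Testomat.io's actual implementation):</p>

```python
import re

def render_step(template: str, params: dict) -> str:
    """Replace each {{name}} placeholder with the matching parameter value;
    unknown placeholders are left as-is."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(params.get(m.group(1), m.group(0))),
        template,
    )

# One data row of a parameterized test case
row = {"username": "alice", "password": "s3cret"}

print(render_step("Enter username {{username}} and password {{password}}", row))
# Enter username alice and password s3cret
print(render_step("User {{username}} is logged in successfully", row))
# User alice is logged in successfully
```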
<p><img src="https://changelog.testomat.io/content/images/2025/06/params-upd.png" alt="Clean Test Tree, Flexible Runs &amp; Parameterized Tests Made Easy" loading="lazy"></p>
<h2 id="track-changes-in-ongoing-test-runs">Track Changes in Ongoing Test Runs</h2>
<p>We&#x2019;ve added a <strong>Configuration Details</strong> view to help you track changes made during an active test run. This includes updates like <strong>added or removed tests</strong> &#x2014; giving your team better visibility and consistency when managing evolving test scopes.</p>
<p>To access this view: Open an ongoing run &#x2192; click the Edit button &#x2192; go to Details</p>
<p><strong>Benefits:</strong></p>
<ul>
<li>Maintain traceability when test scope changes during execution</li>
<li>Ensure test results reflect all modifications made mid-run</li>
<li>Support audit and review processes in dynamic test cycles</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/06/Configuration-details.png" alt="Clean Test Tree, Flexible Runs &amp; Parameterized Tests Made Easy" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Optimized custom fields rendering: truncating long custom field titles and showing more of the value</li>
<li>Improved project management: restricted project settings for non-management users</li>
<li>Fixed parameters rendering in mail notifications <a href="https://github.com/testomatio/app/issues/1232">https://github.com/testomatio/app/issues/1232</a></li>
<li>Optimized Global Analytics data loading for faster results</li>
<li>Fixed missing Feature code when using <code>TESTOMATIO_PREPEND_DIR</code> option <a href="https://github.com/testomatio/app/issues/1352">https://github.com/testomatio/app/issues/1352</a></li>
<li>Polished styles for Company members page</li>
<li>Improved formatting in Comments text area</li>
<li>Fixed assigning users in manual runs <a href="https://github.com/testomatio/app/issues/1355">https://github.com/testomatio/app/issues/1355</a></li>
<li>Optimized requirements processing <a href="https://github.com/testomatio/app/issues/1358">https://github.com/testomatio/app/issues/1358</a></li>
<li>Improved SCIM integration with correct namespace and eager-load compatibility</li>
<li>Fixed an issue with AI-generated BDD descriptions, removing duplicated code</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and check out all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a>, and what other features we should build!</div><!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Smarter Testing with AI Agents, Run Insights & Better Organization]]></title><description><![CDATA[Boost your QA workflow with new AI Agents, smarter test insights, flaky test detection, folder-level analysis, and advanced drag-n-drop organization.]]></description><link>https://changelog.testomat.io/smarter-testing-with-ai-agents-run-insights-better-organization/</link><guid isPermaLink="false">68513d96609cbf3fe7327538</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Wed, 18 Jun 2025 16:13:31 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/06/CleanShot-2025-06-18-at-18.04.57@2x.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="ai-agent-mark-flaky-tests">AI Agent: Mark Flaky Tests</h2>
<img src="https://changelog.testomat.io/content/images/2025/06/CleanShot-2025-06-18-at-18.04.57@2x.png" alt="Smarter Testing with AI Agents, Run Insights &amp; Better Organization"><p>Our new <strong>AI-powered agent</strong> automatically detects flaky tests and assigns them a <strong>&quot;Flaky&quot;</strong> label based on test execution analytics. This helps teams quickly identify unstable tests that require attention and improves visibility into test reliability across projects.</p>
<p>You can customize flaky detection thresholds and conditions in your project&#x2019;s <strong>Analytics Settings</strong> for greater control and accuracy.</p>
<p><strong>Use cases:</strong></p>
<ul>
<li><strong>Quickly isolate unreliable tests</strong> that frequently fail intermittently and impact CI pipelines.</li>
<li><strong>Improve test quality over time</strong> by tracking and addressing flaky tests more systematically.</li>
<li><strong>Streamline triage processes</strong> by flagging flaky tests with a consistent label used across dashboards and reports.</li>
</ul>
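<p>As a rough illustration, a common flakiness heuristic treats a test as unstable when its recent runs contain both passes and failures. The sketch below is purely illustrative — the actual agent uses the configurable thresholds and conditions from your project's <strong>Analytics Settings</strong>, and the function name and defaults here are hypothetical:</p>

```python
def is_flaky(statuses: list[str], min_runs: int = 5) -> bool:
    """Flag a test as a flakiness candidate if, over enough recent runs,
    its outcome is unstable (it has both passed and failed)."""
    if len(statuses) < min_runs:
        # Not enough data to judge stability
        return False
    return "passed" in statuses and "failed" in statuses

print(is_flaky(["passed", "failed", "passed", "passed", "failed"]))  # True
print(is_flaky(["passed"] * 10))  # False: consistently passing, not flaky
```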
<p><img src="https://changelog.testomat.io/content/images/2025/06/flaky.gif" alt="Smarter Testing with AI Agents, Run Insights &amp; Better Organization" loading="lazy"></p>
<h2 id="ai-agent-mark-failed-tests">AI Agent: Mark Failed Tests</h2>
<p>This <strong>AI-driven agent</strong> automatically identifies consistently failing tests and assigns them a <strong>&quot;Fail&quot;</strong> label. It analyzes the last month of test execution data to detect tests that have failed 100% of the time, helping teams focus on the most critical issues first.</p>
<p>You can review and act on these labeled tests directly from analytics or test repository views.</p>
<p><strong>Use cases:</strong></p>
<ul>
<li><strong>Prioritize test fixes</strong> by highlighting tests that fail consistently and block releases.</li>
<li><strong>Improve test suite stability</strong> by quickly spotting long-term failures across large projects.</li>
<li><strong>Enable smart filtering</strong> in dashboards or test plans using the &quot;Fail&quot; label.</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/06/fail.gif" alt="Smarter Testing with AI Agents, Run Insights &amp; Better Organization" loading="lazy"></p>
<h2 id="chat-with-tests-for-a-folder"><strong>Chat with Tests for a Folder</strong></h2>
<p>You can now use <strong>Chat with Tests</strong> to analyze the content of any specific folder within your project. The AI will examine all nested suites and tests under the selected folder, providing targeted insights, summaries, and test suggestions for that specific section of your test repository.</p>
<p>This enables more focused analysis and planning when working with large or modular projects.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/06/chat-w-test-folder.gif" alt="Smarter Testing with AI Agents, Run Insights &amp; Better Organization" loading="lazy"></p>
<h2 id="new-pre-configured-prompts-for-chat-with-tests">New Pre-Configured Prompts for Chat with Tests</h2>
<p>We&apos;ve expanded the <strong>Chat with Tests</strong> AI feature with a set of pre-configured prompts to speed up your work and get instant insights from your test base. The new options include:</p>
<ul>
<li><strong>Summarize this project</strong> &#x2013; Quickly generate a high-level overview of the project, including tested areas, suite structure and understand the scope of each feature and its corresponding test coverage.</li>
<li><strong>Suggest test cases for a specific feature</strong> &#x2013; Instantly generate relevant test cases based on a given feature name or description.</li>
<li><strong>Create a plan with 30 test cases for testing</strong> &#x2013; Let the AI build a structured test plan with a detailed list of test cases tailored to your context.</li>
</ul>
<p>These enhancements turn Chat with Tests into a smart assistant for planning, onboarding, and test gap analysis.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/06/CleanShot-2025-06-18-at-17.17.18@2x.png" alt="Smarter Testing with AI Agents, Run Insights &amp; Better Organization" loading="lazy"></p>
<h2 id="run-summary-overview">Run Summary Overview</h2>
<p>We&#x2019;ve introduced a <strong>Run Summary</strong> feature that provides a concise overview for every completed test run. This summary appears automatically once the run is finished and includes key details such as status breakdown, test duration, and overall test outcomes &#x2014; helping teams quickly assess run results at a glance.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/06/run-summary.gif" alt="Smarter Testing with AI Agents, Run Insights &amp; Better Organization" loading="lazy"></p>
<h2 id="expanded-drag-n-drop-for-tests-suites-and-folders">Expanded Drag-n-Drop for Tests, Suites, and Folders</h2>
<p>We&#x2019;ve enhanced the drag-n-drop functionality to make organizing your test structure more intuitive and efficient. You can now:</p>
<ul>
<li><strong>Reorder tests</strong> directly within a suite using drag-n-drop in the side view.</li>
<li><strong>Move tests between suites</strong> by dragging them from an opened suite and dropping them into another suite in the project tree on the left panel.</li>
</ul>
<p>This update streamlines test suite management, especially in larger projects with complex structures.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/06/dnd-improvements.gif" alt="Smarter Testing with AI Agents, Run Insights &amp; Better Organization" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Optimized data processing for Requirements</li>
<li>Improved OAuth handling for Jira</li>
<li>Improved UI for Company members page - updated styles</li>
<li>Fixed test counter updates after editing a Test Plan - no page refresh needed</li>
<li>Improved the <code>IN</code> operator in TQL - no more errors when filtering by <code>issue</code>, <code>state</code>, <code>status</code>, <code>created_by</code>, etc.</li>
<li>Improved handling of the test structure in manual runs after adding/deleting tests and users</li>
<li>Added <code>substatus</code> parameter to the reporter <a href="https://github.com/testomatio/reporter/issues/541">https://github.com/testomatio/reporter/issues/541</a></li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and check out all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a>, and what other features we should build!</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[AI Agents, Azure OpenAI, SCIM, Bulk User Management]]></title><description><![CDATA[Discover AI Agents for test automation, Azure OpenAI integration, SCIM for Enterprise, bulk user management, and Chat with Tests in the latest Testomat.io update]]></description><link>https://changelog.testomat.io/ai-agents-azure-openai-scim-bulk-user-management/</link><guid isPermaLink="false">683ec45e251da453c9927eed</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Wed, 04 Jun 2025 04:56:14 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/06/preveiw2.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h1 id="ai-agent-write-descriptions-from-code">AI Agent: Write Descriptions from Code</h1>
<img src="https://changelog.testomat.io/content/images/2025/06/preveiw2.png" alt="AI Agents, Azure OpenAI, SCIM, Bulk User Management"><p>We&#x2019;ve introduced a new <strong>AI-powered feature</strong> that automatically generates descriptions for automated tests that currently lack them. The AI agent analyzes each test&#x2019;s code and creates meaningful descriptions to improve readability and documentation across your project.</p>
<p>This significantly reduces the manual effort needed to document existing tests and helps teams quickly understand test coverage without digging into implementation details.</p>
<p><strong>Use cases:</strong></p>
<ul>
<li><strong>Improving test documentation at scale:</strong> Automatically populate missing descriptions in large test repositories.</li>
<li><strong>Onboarding new team members:</strong> New testers or developers can better understand test purposes without reading through code.</li>
<li><strong>Maintaining traceability:</strong> Well-described tests improve coverage visibility and make it easier to link tests to requirements or defects.</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/06/desc-from-code-agent.gif" alt="AI Agents, Azure OpenAI, SCIM, Bulk User Management" loading="lazy"></p>
<h1 id="ai-agent-transform-project-to-bdd">AI Agent: Transform Project to BDD</h1>
<p>With this new feature, you can now <strong>automatically convert an entire project to BDD</strong> using Gherkin syntax. The AI agent analyzes your existing automated tests and creates a new project with the same tests written in the BDD format.</p>
<p>This allows teams to adopt behavior-driven development practices without manually rewriting tests, saving significant time and ensuring consistency.</p>
<p><strong>Use cases:</strong></p>
<ul>
<li><strong>Migrating to BDD workflows:</strong> Quickly shift to Gherkin-based testing without reauthoring test cases from scratch.</li>
<li><strong>Improving collaboration with non-technical stakeholders:</strong> Make tests more readable and accessible to product owners or business analysts.</li>
<li><strong>Standardizing test style across teams:</strong> Align legacy or mixed-format projects to a single, consistent BDD structure.</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/06/class-to-bdd-agent.gif" alt="AI Agents, Azure OpenAI, SCIM, Bulk User Management" loading="lazy"></p>
<h1 id="confluence-as-a-requirement-source">Confluence as a Requirement Source</h1>
<p>You can now <strong>link your Confluence space to a Testomat.io project</strong> and use it as a source of requirements. Once connected, the system can analyze your Confluence pages to extract requirement descriptions, assess traceability, identify edge cases, and generate relevant test suites and test cases.</p>
<p>This integration bridges the gap between documentation and test planning, enabling seamless test coverage based on the requirements your teams already maintain.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/06/confluence-req.gif" alt="AI Agents, Azure OpenAI, SCIM, Bulk User Management" loading="lazy"></p>
<h1 id="ai-powered-chat-with-tests">AI-Powered Chat with Tests</h1>
<p>We&#x2019;ve reintroduced the <strong>Chat with Tests</strong> feature &#x2014; an AI-powered assistant that allows you to <strong>ask questions about existing tests</strong> in your project. The AI analyzes your test repository and responds with insights, summaries, or clarifications based on the actual test content.</p>
<p>This interactive capability makes it easier to explore, understand, and manage large sets of tests without manually browsing through them.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/06/chat-w-test.gif" alt="AI Agents, Azure OpenAI, SCIM, Bulk User Management" loading="lazy"></p>
<h1 id="azure-openai-integration">Azure OpenAI Integration</h1>
<p>You can now <strong>connect your Azure OpenAI account</strong> to power AI-driven features in Testomat.io using your own infrastructure. This integration enables your team to leverage advanced AI capabilities while staying compliant with internal security and data policies.</p>
<p>With Azure OpenAI connected, you can continue using Testomat.io&apos;s intelligent features, including:</p>
<ul>
<li>Generating test descriptions from code</li>
<li>Suggesting test cases based on requirements</li>
<li>Analyzing Confluence or Jira content for test coverage</li>
<li>Explaining automation errors based on test execution data</li>
</ul>
<p><strong>Use cases:</strong></p>
<ul>
<li><strong>Enterprise-grade AI integration:</strong> Maintain control over data flow and comply with company-specific cloud policies.</li>
<li><strong>Secure, scalable AI usage:</strong> Run AI-driven features like requirement analysis and test generation through your Azure infrastructure.</li>
<li><strong>Custom AI behavior:</strong> Tailor AI responses and manage usage across teams with Azure monitoring and configuration options.</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/06/CleanShot-2025-06-03-at-19.37.02@2x.png" alt="AI Agents, Azure OpenAI, SCIM, Bulk User Management" loading="lazy"></p>
<h1 id="scim-integration-enterprise">SCIM Integration (Enterprise)</h1>
<p>We&#x2019;ve added <strong>SCIM (System for Cross-domain Identity Management) integration</strong>, available for <strong>Enterprise clients</strong>. SCIM enables automated provisioning and de-provisioning of users and teams by syncing identity data from your identity provider. Now it&apos;s available for Okta.</p>
<p>This helps enterprise teams streamline user management, reduce manual work, and maintain secure access control at scale.</p>
<p><strong>Use cases:</strong></p>
<ul>
<li><strong>Automated user lifecycle management:</strong> Add or remove users in Testomat.io automatically based on changes in your identity provider.</li>
<li><strong>Simplified onboarding and offboarding:</strong> Ensure new team members are assigned to the right projects and roles without manual configuration.</li>
<li><strong>Improved access governance:</strong> Keep your Testomat.io environment compliant and up to date with organizational structure.</li>
</ul>
<p><strong>Need help setting it up?</strong><br>
Contact our support team to enable SCIM and get assistance with integration.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/06/CleanShot-2025-06-03-at-20.05.18@2x.png" alt="AI Agents, Azure OpenAI, SCIM, Bulk User Management" loading="lazy"></p>
<h1 id="bulk-user-management">Bulk User Management</h1>
<p>Managing users at scale just got easier. On the <strong>Company Members</strong> page, you can now <strong>select multiple users</strong> and apply actions in bulk. This streamlines the process of updating access, roles, or project assignments across large teams.</p>
<p><strong>Benefits:</strong></p>
<ul>
<li><strong>Mass role updates:</strong> Quickly promote multiple users to QA or downgrade access to Read-only with a single action.</li>
<li><strong>Efficient project onboarding:</strong> Assign multiple users to specific projects without repetitive manual steps.</li>
<li><strong>Simplified team management:</strong> Easily deactivate or reassign users in response to organizational changes.</li>
</ul>
<p>This feature is especially useful for growing teams or enterprise accounts managing a large user base.<br>
<img src="https://changelog.testomat.io/content/images/2025/06/mass-man-users.gif" alt="AI Agents, Azure OpenAI, SCIM, Bulk User Management" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Published a fix for WebdriverIO artifacts - use testomatio reporter version 1.6.17</li>
<li>Fixed publishing username when linking test or issues to Linear <a href="https://github.com/testomatio/app/issues/1341">https://github.com/testomatio/app/issues/1341</a></li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and check out all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a>, and what other features we should build!</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Test Comments, Relaunch Updates & UI Improvements]]></title><description><![CDATA[Add comments to tests, view updated status icons, relaunch manual runs, and edit steps even without a description—improving test workflow clarity.]]></description><link>https://changelog.testomat.io/test-comments-relaunch-updates-ui-improvements/</link><guid isPermaLink="false">6826170cce11371d6bfbf03e</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Fri, 16 May 2025 05:13:12 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/05/preview11.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="%F0%9F%92%AC-comments-for-test-cases">&#x1F4AC; Comments for Test Cases</h2>
<img src="https://changelog.testomat.io/content/images/2025/05/preview11.png" alt="Test Comments, Relaunch Updates &amp; UI Improvements"><p>You can now add comments directly to individual test cases. This makes it easy to discuss details with your team, share context, and decide what actions to take next&#x2014;whether to fix, update, or remove a test. Great for collaboration around flaky tests, unexpected failures, or test ownership questions.</p>
<p><strong>Common use cases:</strong></p>
<ul>
<li>Clarify why a test is failing (e.g., due to a known bug or environment issue)</li>
<li>Ask a teammate for input on expected behavior</li>
<li>Link to a related ticket or external discussion</li>
<li>Leave a note about test setup or edge cases for future reference</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/05/Comments.gif" alt="Test Comments, Relaunch Updates &amp; UI Improvements" loading="lazy"></p>
<h2 id="%F0%9F%8E%AF-improved-test-status-icons">&#x1F3AF; Improved Test Status Icons</h2>
<p>We&#x2019;ve updated the icons that indicate a test&#x2019;s status &#x2014; whether it&apos;s <strong>manual</strong>, <strong>automated</strong>, or <strong>detached</strong>. The icons are now more consistent and clearly styled.<br>
They&#x2019;ve also been repositioned to appear <strong>right before the test title</strong>, making it easier to scan and identify the status at a glance.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/05/icons.png" alt="Test Comments, Relaunch Updates &amp; UI Improvements" loading="lazy"></p>
<h2 id="%F0%9F%94%81-advanced-relaunch-for-manual-and-mixed-runs">&#x1F501; Advanced Relaunch for Manual and Mixed Runs</h2>
<p>Advanced relaunch is now available not only for automated runs, but also for <strong>manual</strong> and <strong>mixed</strong> runs.<br>
This allows you to selectively relaunch failed or specific tests within any type of run, giving you more control and flexibility&#x2014;no matter how the test execution was initiated.</p>
<h2 id="%E2%9C%8F%EF%B8%8F-edit-steps-available-without-description">&#x270F;&#xFE0F; Edit Steps Available Without Description</h2>
<p>You can now use the <strong>Edit Steps</strong> feature even if a test case doesn&apos;t have a description yet.<br>
Previously, this option was only accessible when the <em>&#x201C;Steps&#x201D;</em> heading was present in the description. Now, it&apos;s available directly from the <strong>extra menu</strong>, making it easier to add or update steps at any point in your workflow.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/05/edit-steps.gif" alt="Test Comments, Relaunch Updates &amp; UI Improvements" loading="lazy"></p>
<h2 id="%F0%9F%9B%A0%EF%B8%8F-fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">&#x1F6E0;&#xFE0F; Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Fixed rendering of unordered list child items in the markdown editor <a href="https://github.com/testomatio/app/issues/1098">https://github.com/testomatio/app/issues/1098</a></li>
<li>Fixed test content disappearance without saving in Block-based editor <a href="https://github.com/testomatio/app/issues/1317">https://github.com/testomatio/app/issues/1317</a></li>
<li>Improved user management for Manager role <a href="https://github.com/testomatio/app/issues/1322">https://github.com/testomatio/app/issues/1322</a></li>
<li>Added more visibility to AI features with clearer UI</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and check out all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a>, and what other features we should build!</div><!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Enhanced Manual Execution, Metadata Tracking & Suite Navigation]]></title><description><![CDATA[New updates include suite tab separation, metadata in manual runs, configurable views, unified label UI, and more control over manual test execution.]]></description><link>https://changelog.testomat.io/enhanced-manual-execution-metadata-tracking-suite-navigation/</link><guid isPermaLink="false">6808900869fbc243692224f8</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Fri, 09 May 2025 06:15:33 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/05/preview-5-40-1.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="meta-information-support-in-manual-test-execution">Meta Information Support in Manual Test Execution</h2>
<img src="https://changelog.testomat.io/content/images/2025/05/preview-5-40-1.png" alt="Enhanced Manual Execution, Metadata Tracking &amp; Suite Navigation"><p>You can now <strong>add meta information</strong> during manual test runs using <strong>key-value pairs</strong>, enabling teams to capture execution context more precisely. Once the run is completed, this metadata appears in the <strong>Meta</strong> panel of the test result detail view. Example metadata you might include: Browser, OS, Release, Build, Network, Device, Screen Resolution, Special Conditions.</p>
<p>This enhancement supports better <strong>test documentation</strong>, <strong>result filtering</strong>, and <strong>post-execution analysis</strong>.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/05/metadata.gif" alt="Enhanced Manual Execution, Metadata Tracking &amp; Suite Navigation" loading="lazy"></p>
<h2 id="split-suite-view-into-description-and-tests-tabs">Split Suite View into Description and Tests Tabs</h2>
<p>To improve usability, we&#x2019;ve separated the suite detail view into two distinct tabs: <strong>Description</strong> and <strong>Tests</strong>. This change helps users better manage suites that contain extensive documentation or test cases.</p>
<ul>
<li>The <strong>Description</strong> tab displays the suite&#x2019;s documentation and context.</li>
<li>The <strong>Tests</strong> tab now exclusively lists the associated tests.</li>
</ul>
<p>This separation makes it easier to navigate and edit large or content-heavy suites without losing focus.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/05/split-suite-view.gif" alt="Enhanced Manual Execution, Metadata Tracking &amp; Suite Navigation" loading="lazy"></p>
<h2 id="configurable-view-for-manual-test-execution">Configurable View for Manual Test Execution</h2>
<p>We&#x2019;ve enhanced the manual test execution experience by adding the ability to <strong>customize the test list view</strong>. Users can now choose to <strong>show or hide labels and tags</strong>, making it easier to focus on the information that matters most during execution.</p>
<p>This flexibility helps reduce visual clutter or highlight specific metadata depending on your testing workflow. Simply toggle visibility options directly in the interface to tailor the view to your preferences.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/05/show-hide-labels-tags.gif" alt="Enhanced Manual Execution, Metadata Tracking &amp; Suite Navigation" loading="lazy"></p>
<h2 id="unified-sidebar-for-labels-and-custom-fields-management">Unified Sidebar for Labels and Custom Fields Management</h2>
<p>We&#x2019;ve updated the <strong>labels assignment interface</strong> by introducing a consistent <strong>sidebar view across all pages</strong>. Whether you&#x2019;re managing labels or custom fields, you&#x2019;ll now experience the same streamlined and intuitive interface throughout the platform.</p>
<p>This change ensures a more <strong>cohesive user experience</strong> and simplifies bulk updates and quick edits across plans, steps, runs, or test cases.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/05/labels-detail-view.gif" alt="Enhanced Manual Execution, Metadata Tracking &amp; Suite Navigation" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Added the ability to assign labels from the Run Report view</li>
<li>Fixed an error when opening a test from Clusterize Errors <a href="https://github.com/testomatio/app/issues/1321">https://github.com/testomatio/app/issues/1321</a></li>
<li>Improved counter updates for Run Groups</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and explore all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a> and learning what other features we should build!</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[AI-Driven Requirements, Enhanced Execution, Reporting & Collaboration]]></title><description><![CDATA[Introducing AI-powered requirements, BDD drafts, advanced test relaunch, in-run test creation, PDF reports, and expanded attachments in Testomat.io]]></description><link>https://changelog.testomat.io/ai-driven-requirements-enhanced-execution-reporting-collaboration/</link><guid isPermaLink="false">67e419e8871db743329d4138</guid><category><![CDATA[cloud]]></category><category><![CDATA[on-premise]]></category><dc:creator><![CDATA[Testomatio Team ]]></dc:creator><pubDate>Fri, 25 Apr 2025 20:14:06 GMT</pubDate><media:content url="https://changelog.testomat.io/content/images/2025/04/preview-ai.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h2 id="ai-powered-requirements">AI-Powered Requirements</h2>
<img src="https://changelog.testomat.io/content/images/2025/04/preview-ai.png" alt="AI-Driven Requirements, Enhanced Execution, Reporting &amp; Collaboration"><p>We&#x2019;ve introduced <strong>AI-powered requirements analysis</strong> to streamline test coverage alignment with product requirements. When you link a Jira issue as a requirement in Testomat.io, the system will analyze the issue&#x2019;s description and automatically suggest a structured requirement description. Based on the analysis, you&#x2019;ll be offered two intelligent options:</p>
<ul>
<li><strong>Generate a new test suite with suggested test cases</strong></li>
<li><strong>Analyze existing suites and suggest test coverage improvements</strong></li>
</ul>
<p>This feature is available both at the <strong>project level</strong> and within individual <strong>test suites</strong>, enabling flexible, requirements-driven testing whether you&#x2019;re planning at a high level or working in a focused domain.</p>
<p><strong>Key benefits:</strong></p>
<ul>
<li>Automates the transition from requirements to test cases</li>
<li>Ensures traceability and alignment between business goals and test coverage</li>
<li>Reduces manual effort and potential gaps in test planning</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/04/requirements.gif" alt="AI-Driven Requirements, Enhanced Execution, Reporting &amp; Collaboration" loading="lazy"></p>
<h2 id="ai-suggested-bug-description">AI-Suggested Bug Description</h2>
<p>To accelerate the defect reporting process, we&#x2019;ve introduced <strong>AI-generated bug descriptions</strong>. When executing tests and creating a new defect, Testomat.io will automatically suggest a concise, context-aware bug title and description. These suggestions are based on the test case content and its execution results, helping teams report issues faster and more consistently.</p>
<p><strong>Why this is useful:</strong></p>
<ul>
<li><strong>Speeding up defect logging:</strong> Testers can instantly use or refine AI-suggested bug details, reducing time spent writing repetitive or obvious issue reports.</li>
<li><strong>Maintaining consistent bug reporting standards:</strong> The AI helps standardize descriptions across team members, which improves clarity and communication with developers.</li>
<li><strong>Assisting less experienced testers:</strong> Junior team members or non-technical testers can rely on AI-generated suggestions as a starting point, ensuring important details aren&#x2019;t missed.</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/04/ai-suggested-bug-desc.gif" alt="AI-Driven Requirements, Enhanced Execution, Reporting &amp; Collaboration" loading="lazy"></p>
<h2 id="draft-support-for-bdd-scenarios">Draft Support for BDD Scenarios</h2>
<p>We&#x2019;ve added the ability to save <strong>BDD Features</strong> and <strong>Scenarios</strong> as drafts, even when they contain syntax errors or typos. This feature was requested by users who needed more flexibility when writing or editing under time pressure. Previously, Testomat.io enforced strict BDD syntax validation, preventing users from saving incomplete or incorrect scenarios &#x2014; now, you can save your work and return later to fix it.</p>
<p><strong>Use cases:</strong></p>
<ul>
<li>You&apos;re in the middle of writing a scenario but have to leave for a meeting &#x2014; save your progress without losing your work.</li>
<li>A teammate wants to jot down a test idea quickly but doesn&#x2019;t have time to ensure the syntax is correct.</li>
<li>You&apos;re experimenting with scenario structure and want to save a rough version before finalizing it later.</li>
</ul>
<p><strong>Please note:</strong> Only one draft is allowed per BDD Feature/Scenario at a time.</p>
<p><img src="https://changelog.testomat.io/content/images/2025/04/BDD-drafts.gif" alt="AI-Driven Requirements, Enhanced Execution, Reporting &amp; Collaboration" loading="lazy"></p>
<h2 id="advanced-relaunch-options-for-automated-test-runs">Advanced Relaunch Options for Automated Test Runs</h2>
<p>Based on user feedback, we&#x2019;ve expanded our CI integrations to allow more flexibility when relaunching tests. You can now <strong>select specific tests to relaunch</strong>, instead of repeating the full test run. This allows for more targeted, efficient workflows &#x2014; whether you&#x2019;re rerunning failed tests or just need to retest a subset of scenarios.</p>
<p><strong>Key benefits:</strong></p>
<ul>
<li>Customize relaunches to better fit your team&apos;s workflow</li>
<li>Simplify recovery after CI failures or flaky test results</li>
<li>Avoid redundant test executions and manual reconfiguration</li>
</ul>
<p><strong>How it works:</strong></p>
<ol>
<li>Open a completed automated test run</li>
<li>Click the <strong>Extra menu</strong> button</li>
<li>Select <strong>Advanced relaunch</strong></li>
<li>In the sidebar, configure your relaunch (optional):
<ul>
<li>Enter a custom run title</li>
<li>Choose to create a new run (if needed)</li>
<li>Select specific tests to include</li>
</ul>
</li>
<li>Click <strong>Relaunch</strong></li>
</ol>
<p><img src="https://changelog.testomat.io/content/images/2025/04/Advanced-relaunch.gif" alt="AI-Driven Requirements, Enhanced Execution, Reporting &amp; Collaboration" loading="lazy"></p>
<h2 id="expanded-attachments-capabilities">Expanded Attachments Capabilities</h2>
<p>We&#x2019;ve enhanced the ability to manage attachments across Testomat.io by allowing users to add attachments to <strong>suites</strong>, <strong>folders</strong>, and the <strong>readme section</strong>. This improvement helps streamline workflows by keeping all relevant files and documentation in one place. Whether you&#x2019;re sharing important notes, reference materials, or test data, you can now attach them directly to the relevant test structures for easy access.</p>
<p><strong>Use cases:</strong></p>
<ul>
<li><strong>Suite-level attachments:</strong> Attach detailed test execution reports or configuration files to specific test suites for easy reference by team members.</li>
<li><strong>Folder-level attachments:</strong> Add relevant project documentation or setup instructions to test folders, ensuring that all files related to a specific testing area are easily accessible.</li>
<li><strong>Readme section attachments:</strong> Include critical resources or additional context in the readme section, such as diagrams, code samples, or links to external resources, improving overall clarity for team members.</li>
</ul>
<h2 id="test-creation-during-manual-run-execution">Test Creation During Manual Run Execution</h2>
<p>We&#x2019;ve added the ability to <strong>create new tests directly during a manual test run</strong>. This streamlines the process of capturing missing or newly discovered scenarios on the fly, without interrupting your testing flow. When a test is added this way, it is automatically saved to the appropriate suite in the test repository and immediately included in the current test plan.</p>
<p>If needed, this functionality can be disabled by selecting <strong>Hide Test Creation</strong> from the extra options menu above the test list.</p>
<p><strong>Use cases:</strong></p>
<ul>
<li><strong>Identifying gaps in test coverage during execution:</strong> While performing manual testing, a tester discovers an untested scenario. They can instantly add it to the run and repository without switching context.</li>
<li><strong>Capturing exploratory test cases:</strong> Testers conducting exploratory testing can log and formalize new test cases as they go, ensuring useful insights are not lost.</li>
<li><strong>Collaborative test planning in real time:</strong> During team sessions or test reviews, testers can collectively identify and add new tests based on discussion or observed issues.</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/04/test-creation-from-run.gif" alt="AI-Driven Requirements, Enhanced Execution, Reporting &amp; Collaboration" loading="lazy"></p>
<h2 id="export-run-reports-to-pdf">Export Run Reports to PDF</h2>
<p>You can now <strong>export your test run results as a PDF report</strong>. This feature enables easy sharing and archiving of test outcomes outside of Testomat.io. After completing a run, simply open the <strong>extra options</strong> menu and select <strong>Export to PDF</strong> to generate a structured report with test execution details.</p>
<p><strong>Use cases:</strong></p>
<ul>
<li><strong>Sharing results with stakeholders:</strong> Quickly generate a portable summary of test outcomes for product owners, QA managers, or external partners who may not have access to the platform.</li>
<li><strong>Audit and compliance documentation:</strong> Maintain a formal record of test results in a fixed format for compliance, certification, or audit purposes.</li>
<li><strong>Team retrospectives and reviews:</strong> Use the PDF report to review test coverage and outcomes during retrospectives or sprint reviews without relying on live access to the system.</li>
</ul>
<p><img src="https://changelog.testomat.io/content/images/2025/04/pdf-report.png" alt="AI-Driven Requirements, Enhanced Execution, Reporting &amp; Collaboration" loading="lazy"></p>
<h2 id="fixes-and-improvements-%F0%9F%9B%A0%EF%B8%8F">Fixes and Improvements &#x1F6E0;&#xFE0F;</h2>
<ul>
<li>Fixed loss of issue links after merging runs <a href="https://github.com/testomatio/app/issues/1299">https://github.com/testomatio/app/issues/1299</a></li>
<li>Improved the Jira links UI to handle longer summaries</li>
<li>Fixed multi-selection in the Runs table view <a href="https://github.com/testomatio/app/issues/1243">https://github.com/testomatio/app/issues/1243</a></li>
<li>Improved handling of NUnit tests <a href="https://github.com/testomatio/app/issues/1283">https://github.com/testomatio/app/issues/1283</a>, <a href="https://github.com/testomatio/app/issues/1279">https://github.com/testomatio/app/issues/1279</a>, <a href="https://github.com/testomatio/app/issues/1280">https://github.com/testomatio/app/issues/1280</a></li>
<li>Added tracking of run deletion and purging in Pulse</li>
<li>Added test parameters to Run Reports <a href="https://github.com/testomatio/app/issues/1262">https://github.com/testomatio/app/issues/1262</a></li>
<li>Fixed public report links opened from Slack redirecting to the login page instead of the run report <a href="https://github.com/testomatio/app/issues/1300">https://github.com/testomatio/app/issues/1300</a></li>
<li>Improved performance of adding and removing tags <a href="https://github.com/testomatio/app/issues/1247">https://github.com/testomatio/app/issues/1247</a></li>
<li>Improved the Jira links UI by showing the status and summary of the linked defect</li>
</ul>
<div style="text-align:center;"> &#x1F680; Create a <a href="https://app.testomat.io">demo project for free</a> and explore all the amazing features right now. We look forward to <a href="https://testomat.nolt.io">hearing what you think</a> and learning what other features we should build!</div><!--kg-card-end: markdown-->]]></content:encoded></item></channel></rss>