Today’s release brings helpful new capabilities for chatting with Pieces Copilot in the Pieces Desktop app. With new persisted chats, you can revisit a conversation with Pieces Copilot later, launch a new conversation without deleting the old one, and more.
But that’s not all in this release! We’ve made it faster to download local LLMs, added additional context to the Pieces Copilot, crushed bugs, and more. Check it all out ⬇️
In this release, we've completely revamped the Pieces Copilot experience! The new version offers a fresh, enhanced user interface, and conversations are now automatically persisted by default, allowing you to access your full list of previous conversations sorted by day and time. You can create and manage as many conversations as you like, even across different Pieces products and plugins.
Customize your list by pinning or renaming conversations, and tailor each conversation with conversation-specific context such as local files and folders, saved snippets, snippets from websites, and messages within the conversation.
We've also introduced automatic annotations for each conversation, enabling you to view a brief summary without reading through the message history.
This release is packed with new features and enhancements to make your experience with the Pieces Copilot even more productive, contextual, and user-friendly.
To create a new chat with the Pieces Copilot, just hit “New Chat” in the upper left corner of the copilot screen. Pieces will automatically generate a title for your chats, but you can edit it at any time by clicking the chat’s title.
You can easily pin a chat to the top of your list, delete it, or generate a summary of it by choosing an option from the three dots on the right side of the chat’s title.
You can save snippets generated by Pieces Copilot by hitting the Save to Pieces button under a code block, but you can also save all of the snippets in a chat by hitting the three dots to the right of the chat title. As always, all of the snippets that you save retain valuable context and metadata that you can view at any time in the Pieces Desktop App.
Once you install this update, your past conversations with the Pieces Copilot will be present as persistent chats, so you can revisit them. You can maintain as many conversations with the copilot as you like, so don’t be afraid to ask a new question!
Finally, with persistent chats, the Pieces Suite is even more interconnected. You can view chats with the Pieces Copilot that were initiated in our other products, including Pieces for VS Code and Pieces for Obsidian, in the desktop app and our extensions. No need to return to your browser to view a conversation; it’s all accessible in the desktop app, your IDE, or wherever you’re working.
Air-gapped, local copilot chats? Say less! This is one of our favorite features of the Pieces Copilot; we love being able to leverage the power of an LLM while keeping all of our code securely on our machine. However, downloading local LLMs used to be slow, but no longer. We’ve decreased the amount of time it takes to download a local LLM by more than 50% so that you can start chatting more quickly.
To download a local LLM and begin chatting with Pieces Copilot, select the copilot runtime picker in the lower left corner of the Pieces Desktop App. From this menu, you can download the model that works best for you.
Last but certainly not least, we’ve crushed a whole bunch of bugs and made some useful adjustments to our user interfaces in order to upgrade your user experience. This includes a major update to the Pieces Copilot empty state, fixing small bugs in Pieces OS, and adding a link to our documentation from the Pieces OS menu.
Do you love Pieces? Stop sending us carrier pigeons 🐦 and join our Discord Server to chat with our team and other power users, get support, and more. 🤝
As always, if you run into issues or have feedback, please fill out this quick form or email us at firstname.lastname@example.org and we’ll be in touch as soon as possible!