Pieces Copilot is your Development Superpower 🦸: Pieces for Developers 2.9.0 & Pieces OS 7.1.0
Our final feature release of 2023 is devoted to making your Pieces Copilot experience the best it can possibly be, and it is LOADED!
This release is layered with improvements big and small to increase the time you save and decrease the context-switching you do throughout your development cycle. We’re excited for you to take advantage of these updates as we close out the year, and we can’t wait to show you all what we have in store for 2024!
We want to make Pieces Copilot your one-stop shop for code generation and accurate answers to coding questions. As fine-tuned LLMs become more prevalent, we want to make sure that you can use any custom LLM to power Pieces Copilot, in addition to popular LLMs like GPT-4, PaLM 2, or Llama 2.
In this release, we’re thrilled to introduce the first phase of our OpenAI integration where you can now add your own OpenAI API Key in the Pieces Desktop app to power the Pieces Copilot!
To access this feature, open the Pieces Copilot Runtime manager, navigate to the Cloud OpenAI models, and select “Register Custom API Key.” From here, you can add your own API key and start chatting with the Pieces Copilot powered with your LLM.
With this first version, you can add your personal or enterprise API key from OpenAI, allowing you to use your own quota and rate limits. In our next release, the platform will accommodate various “Bring your own model” scenarios, with a significant emphasis on supporting those who have fine-tuned GPT models based on their organizational or team-specific requirements.
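To make concrete what "your own quota and rate limits" means: requests are authenticated with your key, just as if you called OpenAI's public chat-completions endpoint yourself. Here is a minimal TypeScript sketch of such a request. The endpoint URL and body shape follow OpenAI's published HTTP API, but the helper function and model name are illustrative, and this is not a description of how Pieces works internally.

```typescript
// Sketch: build (but don't send) an OpenAI chat-completions request
// authenticated with a user-supplied API key. The endpoint and body shape
// follow OpenAI's public HTTP API; the helper itself is illustrative.
interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

function buildChatRequest(apiKey: string, prompt: string): ChatRequest {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    headers: {
      // Your own key means your own quota and rate limits apply.
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4", // any chat model your key has access to
      messages: [{ role: "user", content: prompt }],
    }),
  };
}

// Sending it is a single fetch (requires a valid key and network access):
// const req = buildChatRequest(process.env.OPENAI_API_KEY!, "Explain closures");
// const res = await fetch(req.url, { method: "POST", headers: req.headers, body: req.body });
```

Because the key travels in the `Authorization` header of each request, swapping in a personal or enterprise key changes whose usage is billed without changing anything else about the conversation flow.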
Our team layered in tons of tweaks to make the Pieces Copilot the best experience available for developers. In addition to some awesome behind-the-scenes improvements, we added several particularly notable features that you can use today.
Streamline your Copilot Context Configuration with snippet previews! Now, when choosing conversation context, get a quick look at a snippet's content. This feature empowers you to make more informed decisions about which snippets to include, enhancing the precision of your Copilot interactions.
Calling all Windows users! You can now paste images directly into the Copilot input field; the code or text is extracted automatically so you can start a conversation right from a screenshot. This feature is designed to elevate your coding experience, making it smoother and more intuitive.
This release is packed with many more quality-of-life improvements to make your experience with the Pieces Copilot everything you’d expect and more.
As we’ve mentioned, shipping Local Large Language Models (LLLMs) is one of our proudest moments of the year. It’s a huge win for privacy and security to have on-device LLMs that power Pieces Copilot. To make the LLLM experience better, we added real-time download percentages so you know how close your model is to being ready, as well as the ability to remove downloaded LLLMs from your machine. To remove a downloaded LLLM, just hover over the model in the runtime selection dialog and hit the delete button.
Some other notable improvements include, but aren’t limited to:
We know many of your conversations include lots of generated code snippets, making it harder to reference previous parts of your conversation. To give you more control over your Copilot UI, we added the ability to collapse code blocks to make scrolling through a conversation a little bit easier.
Faster on-device LLM download speeds and better snippet context handling are just a few of the many performance upgrades in this release. You'll notice all of your tags, related links, annotations, anchors, and related people will feel far more stable and load significantly faster than in previous versions.
As always, this release is chock-full of bug fixes. A few users reported intermittent issues when generating shareable links and when linking their Google & GitHub accounts.
Now, both of these issues have been eliminated! A major shout-out to the community for reporting these issues and helping us get them solved.
Last, but certainly not least, we’ve finally solved an issue on certain Windows machines where the Pieces Desktop App threw a Package Support Framework error when trying to launch the application.
After hunting tirelessly for this issue, the team is proud to have this one solved and patched in the latest release.
Did you know Pieces has recently broken into the Open Source community? We just launched our TypeScript SDK on NPM, and developers around the world have started building on top of the Pieces Platform and getting familiar with our APIs.
With our SDKs, you can build your own apps, extend Pieces functionality, and so much more. Check out our GitHub to learn more about our Open Source initiatives and how you can start contributing today!
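A first experiment with the SDK usually starts by confirming that Pieces OS is running locally, since the SDK talks to it over localhost. Here is a tiny TypeScript sketch of that check. The port number and the `.well-known/health` path are assumptions based on common local-server conventions, not documented values; consult the SDK's README for the real client setup.

```typescript
// Sketch: reach a locally running Pieces OS instance over HTTP.
// The port (1000) and the well-known health path are assumptions;
// the real TypeScript SDK wraps this local API for you.
const PIECES_OS_BASE = "http://localhost:1000";

function wellKnownUrl(base: string, resource: string): string {
  // Strip any trailing slashes, then append the .well-known path.
  return `${base.replace(/\/+$/, "")}/.well-known/${resource}`;
}

async function piecesOsIsRunning(base: string = PIECES_OS_BASE): Promise<boolean> {
  try {
    const res = await fetch(wellKnownUrl(base, "health"));
    return res.ok; // HTTP 200 when the local server is up
  } catch {
    return false; // connection refused: Pieces OS isn't running
  }
}
```

If the check passes, you know the local API the SDK depends on is reachable before you wire up anything more ambitious.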
If TypeScript isn’t your primary language, no problem 🙂 Soon we’ll have support for Python, Kotlin, Dart, and more, so stay tuned for future updates.
Do you love Pieces? Stop sending us carrier pigeons 🐦 and join our Discord Server to chat with our team, other power users, get support, and more. 🤝
As always, if you run into issues or have feedback, please fill out this quick form or email us at firstname.lastname@example.org and we’ll be in touch as soon as possible!