Often, we need to integrate custom code into our Webflow site. If we do it natively, any change to our code requires a site-wide publish before we can test on the staging or custom domain, since JavaScript is not executed in the Designer. This can really slow down our workflow and hinder collaboration.
Instead, keep the code separate from Webflow and test it locally against the Webflow staging domain. This lets us see changes much faster, without re-publishing in Webflow each time the custom code is updated.
The script below loads a different script source on staging than in production. All testing should happen on the webflow.io staging or branch staging domain; when ready, the production script can be updated for the changes to go live.
Since the staging and custom domain have the same output from Webflow, no additional work will be required once the production script is updated and live.
Updating the production script will vary between implementations, but most commonly it means pushing the latest version to GitHub which kicks off the deployment workflow.
const script = document.createElement("script");
if (window.location.hostname.includes("webflow.io")) {
  // replace with your dev script URL
  script.src = "https://url-to-your-dev-script.js";
} else {
  // replace with your prod script URL
  script.src = "https://url-to-your-prod-script.js";
}
document.body.appendChild(script);
Alternatively, no-code tools like Wized can be used.
Using Visual Studio Code (VSC) or your code editor of choice, we can use the Live Server plugin or equivalent to run scripts locally. With this plugin, your script will be available on http://localhost:3000/ or a similar port. From there, you can navigate to the file depending on the path (i.e., http://localhost:3000/index.js).
Take the path of your file and use it as the dev script URL in the first branch of the if-statement above, which checks whether you're on the Webflow staging domain. Then go to the staging domain and reload the page each time your script is updated.
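During local testing, the staging branch of the loader can point straight at the locally served file. A minimal sketch of this idea follows; the localhost port and file name are assumptions, so match them to whatever your local server actually reports.

```javascript
// The localhost port and file name below are assumptions; match them to
// what Live Server (or your local server of choice) reports.
function resolveScriptSrc(hostname) {
  if (hostname.includes("webflow.io")) {
    // Webflow staging domain: load the locally served script
    return "http://localhost:3000/index.js";
  }
  // Anywhere else: load the hosted production script
  return "https://url-to-your-prod-script.js";
}

// Runs only in the browser; guarded so the snippet is safe to load elsewhere
if (typeof document !== "undefined") {
  const script = document.createElement("script");
  script.src = resolveScriptSrc(window.location.hostname);
  document.body.appendChild(script);
}
```

Because the hostname check drives the decision, the same loader can stay in the Webflow page head or footer code permanently.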
If this is not an option, you can have two hosted scripts — one for staging and one for production. When making changes, publish the staging hosted script and test on the Webflow staging domain.
Often, we need to make third-party API calls. However, making them from localhost is not always possible due to CORS restrictions (depending on the service). In this case, use a tool or browser extension such as Allow CORS: Access-Control-Allow-Origin to work around the restriction during local testing.
Additionally, some browsers (e.g., Safari) can be strict about localhost, so ensure testing is done in a browser where this is not an issue.
After setting up the staging environment based on the instructions above, there isn't much work left to get it working in production. Apart from your own deployment workflow, the only step is pushing the recent changes to your production script. Common solutions we see for hosting the script include GitHub Pages, Cloudflare Pages, or jsDelivr.
Since all code in Webflow runs client-side, we never want to include sensitive details like API keys in our code. If you're calling a service that requires an API key in the request, consider using an intermediary server or backend-as-a-service (e.g., Autocode, AWS Lambda, Cloudflare Workers, or your own server) so the key is never exposed to the client. From Webflow, you make an API call to your backend, and the backend then makes the authenticated call to the protected service in a safe environment.
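As an illustration of this pattern (not any specific provider's API), a backend proxy handler might look like the sketch below. The target URL and query parameter are placeholders, and the fetch implementation is injected so the handler stays easy to test; in production you would pass the global fetch and read the key from an environment variable.

```javascript
// Sketch of a backend proxy that keeps the API key server-side.
// THIRD_PARTY_URL is a placeholder; adapt it to your service.
const THIRD_PARTY_URL = "https://api.example.com/data";

async function proxyRequest(query, fetchImpl, apiKey) {
  // The key lives on the server and never reaches the browser
  const response = await fetchImpl(
    `${THIRD_PARTY_URL}?q=${encodeURIComponent(query)}`,
    { headers: { Authorization: `Bearer ${apiKey}` } }
  );
  return response.json();
}
```

From the Webflow page, the client would then call your backend endpoint with no key attached, e.g. `fetch("https://your-backend.example.com/proxy?q=term")`.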
This approach is reflected in the following repos: Fetch api and write to dom and greenhouse jobs. The greenhouse jobs example demonstrates the cloneNode() API, which removes the need to create DOM elements in your code. Elements can be copied directly from the page, i.e., elements built in Webflow.
Most APIs have a rate limit. This means we may see failed responses from a third-party service if we experience high traffic. In this case, consider implementing throttling or the exponential backoff pattern. Depending on the implementation and environment (i.e., Webflow, a Node.js backend, etc.), one of these methods may be more appropriate than the others.
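Exponential backoff can be sketched in a few lines of plain JavaScript. The retry count and delay values below are illustrative defaults, not a prescription:

```javascript
// Retry an async function, doubling the wait after each failure.
// maxRetries and baseDelayMs are illustrative defaults.
async function withBackoff(fn, maxRetries = 3, baseDelayMs = 500) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err; // give up after the last retry
      const delay = baseDelayMs * 2 ** attempt; // 500, 1000, 2000, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Usage would look like `const items = await withBackoff(() => fetch(url).then((r) => r.json()));`, so a transient 429 response retries with growing pauses instead of failing immediately.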
Sometimes our code requires data to change and persist across pages as the user navigates our site. In this case, use the localStorage API or cookies to save and retrieve data between pages.
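For example, a small pair of helpers around localStorage handles the save/retrieve round trip; the key names are arbitrary, and JSON serialization is used so objects survive the trip, since localStorage only stores strings:

```javascript
// Persist a value across page navigations. JSON.stringify/parse let us
// store objects, not just strings (localStorage values are strings).
function saveState(key, value) {
  localStorage.setItem(key, JSON.stringify(value));
}

function loadState(key) {
  const raw = localStorage.getItem(key);
  return raw === null ? null : JSON.parse(raw);
}
```

On one page you might call `saveState("cart", { items: 2 })`, and on the next page `loadState("cart")` returns the same object.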
Moving data to the Webflow CMS is a common task. This process can be done by CSV import or through the Webflow API. The former option can be more labor intensive and time-consuming if working with a lot of data. The Webflow API allows you to automate this process. See the following example GitHub repo: Populate data in the Webflow CMS.
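As a sketch, creating an item through the v2 API is a POST to the collection's items endpoint. The collection ID, token, and field names below are placeholders, and the fieldData keys must match the field slugs in your collection's schema, so verify the request shape against the API reference:

```javascript
// Sketch of a Webflow API v2 create-item request. collectionId and
// token are placeholders; fieldData keys must match your collection's
// field slugs in Webflow.
function buildCreateItemRequest(collectionId, token, fieldData) {
  return {
    url: `https://api.webflow.com/v2/collections/${collectionId}/items`,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ isArchived: false, isDraft: false, fieldData }),
    },
  };
}

// Usage (in a backend environment, never client-side with a real token):
// const { url, options } = buildCreateItemRequest("COLLECTION_ID", "API_TOKEN",
//   { name: "Hello", slug: "hello" });
// const res = await fetch(url, options);
```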
Webflow also has an npm package, but it currently only supports v1 of the API.
Like most APIs, the Webflow API has a rate limit (varies by plan). It’s important to take this into account when sending data to Webflow in large chunks. Ensure calls to the Webflow API are handled appropriately in your code. A common npm library to help with this is Bottleneck.
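Bottleneck does this with a scheduler; the core idea can be sketched by hand as a queue that runs tasks one at a time with a minimum gap between them. This simplified version waits after each task completes before starting the next (Bottleneck's minTime option spaces task starts, and it offers far more control), so treat it as an illustration rather than a replacement:

```javascript
// Minimal rate limiter: runs queued tasks sequentially, waiting at least
// minTimeMs after each task finishes before starting the next.
function createLimiter(minTimeMs) {
  let chain = Promise.resolve();
  return function schedule(task) {
    const result = chain.then(() => task());
    chain = result
      .catch(() => {}) // a failed task should not block the queue
      .then(() => new Promise((resolve) => setTimeout(resolve, minTimeMs)));
    return result;
  };
}
```

Usage: `const schedule = createLimiter(1000);` then wrap each API call as `schedule(() => fetch(url, options))` so bursts of items are spaced out under the rate limit.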
The Webflow API also allows you to add webhooks to listen to certain events in Webflow. This is commonly used to send data to other platforms when a form is submitted, a CMS item is changed, and more. See the Webhook example repo for more information.
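Creating a webhook through the API is a POST to the site's webhooks endpoint. A hedged sketch of the v2 request follows; the site ID, token, and destination URL are placeholders, and the available triggerType values should be verified against the API reference ("form_submission" is one example):

```javascript
// Sketch of a Webflow API v2 create-webhook request. siteId and token
// are placeholders; check the API reference for valid triggerType values.
function buildCreateWebhookRequest(siteId, token, triggerType, destinationUrl) {
  return {
    url: `https://api.webflow.com/v2/sites/${siteId}/webhooks`,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ triggerType, url: destinationUrl }),
    },
  };
}
```

Webflow then POSTs the event payload to your destination URL whenever the trigger fires.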
Note: Only the form submission and site publish webhooks can be added natively in project settings. All other webhooks must be added through the API.
Every collection and collection item in Webflow has a unique ID that helps identify it when interacting with the API. Use them to cross-reference relevant Webflow collections and items in your external app. In some cases, you may want to store an external ID in your Webflow CMS schema if applicable.
When the staging and production publish times are out of sync in Webflow, single item publishing from the API will not work. Both domains must be published together for single item publishing to work.
However, the new branch staging feature removes this issue by ensuring the main staging domain and custom domains are published at the same time. Encourage your team to incorporate branch staging into their workflow to avoid breaking single item publishing and the integrations that rely on it.
API tokens can have different scopes. When you generate a token, be sure to define the appropriate CRUD (create, read, update, and delete) access to relevant endpoints.
API v1 and v2 endpoints require their own distinct API tokens. It’s not possible to use a v1 token with v2 endpoints and vice versa.
You can build a custom App that will run in the Designer that is private to your team's Workspace. An App can interact with the Designer canvas and automate certain tasks. All areas that can be interacted with are mentioned in the Designer APIs.
Data Clients allow you to interact with backend services that you would normally interact with using an API token. However, you get access from a different perspective because you can perform these actions from within a site in the Designer on behalf of the user. For example, you can upload assets, create webhooks, read and write page data, and more. This is possible by minting an access token and using it to make API calls on behalf of the user.
A hybrid app is one that combines the App and Data Client aspects mentioned above.