What type of API does LiveChat provide?
LiveChat provides REST-style web APIs: requests are made over standard HTTP to versioned endpoints (for example, https://api.livechatinc.com/v3.5/) and responses come back as JSON. For context, here is how the common API styles compare.
REST (Representational State Transfer) is one of the most popular API architectures:
- It uses standard HTTP methods like GET, POST, PUT, DELETE
- Typically returns data in JSON format
- Has multiple endpoints representing different resources
- Is stateless and cacheable
- Is widely used for web services and mobile applications
GraphQL is a more modern API query language and runtime:
- It has a single endpoint where clients can request exactly the data they need
- Allows fetching multiple resources in a single request
- Provides a strongly-typed schema
- Gives clients more control over the data they receive
- Is good for complex data requirements and reducing over-fetching
SOAP (Simple Object Access Protocol) is an older protocol:
- Uses XML for message format
- Has stricter standards and built-in error handling
- Is more complex to implement than REST
- Provides features like security and transaction management
- Still used in some enterprise and legacy systems
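To make the REST pattern concrete, the sketch below shows the typical verbs in Python. Everything in it is a hypothetical placeholder (the https://api.example.com service, the tickets resource, the bearer token); it illustrates the style rather than a real LiveChat endpoint.

```python
import requests

BASE = "https://api.example.com/v1"            # hypothetical REST service, not LiveChat
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credential

# GET: read a collection of resources; the body comes back as JSON
tickets = requests.get(f"{BASE}/tickets", headers=HEADERS).json()

# POST: create a new resource under the collection endpoint
created = requests.post(
    f"{BASE}/tickets",
    headers=HEADERS,
    json={"subject": "Billing question", "priority": "high"},
).json()

# PUT: replace the resource at its own endpoint
requests.put(
    f"{BASE}/tickets/{created['id']}",
    headers=HEADERS,
    json={"subject": "Billing question", "priority": "low"},
)

# DELETE: remove the resource
requests.delete(f"{BASE}/tickets/{created['id']}", headers=HEADERS)
```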
Does the LiveChat API have webhooks?
Yes, the LiveChat API supports webhooks.
Types of Events
LiveChat webhooks allow you to subscribe to various types of events, including:
- incoming_chat - Notifies when a new chat starts
- tag_created - Notifies when a new tag is created
- incoming_event - Notifies about each new event (such as a message) within a chat
- chat_deactivated - Notifies when a chat ends
How Webhooks Work
- Webhooks are HTTP POST requests sent by LiveChat to a URL you specify when an event occurs (a minimal receiver is sketched after this list)
- They allow automated information exchange between LiveChat and other services
- You need to register webhooks for your application to receive them
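Below is a minimal sketch of such a receiver using only the Python standard library. The assumption that the event name arrives in an `action` field of the JSON body is mine; check the webhook payload reference for the exact shape.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and parse the JSON body that LiveChat POSTs to the registered URL.
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")

        # The event name is assumed to arrive in an "action" field
        # (e.g. "incoming_chat", "chat_deactivated"); verify against the docs.
        print("Received webhook event:", body.get("action", "<unknown>"))

        # Acknowledge with HTTP 200 so LiveChat treats the delivery as successful.
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()
```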
Configuring Webhooks
There are two main ways to configure webhooks:
- Via the Developer Console: use the "Chat webhooks" building block to set up webhooks
- Via the Configuration API: make API calls to register webhooks programmatically (see the sketch below)
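A programmatic registration might look like the sketch below. The endpoint path, the payload fields, and the use of Basic auth with a Personal Access Token are assumptions based on the general v3.5 Configuration API pattern, so verify them against the register_webhook reference.

```python
import requests

# Assumed endpoint following the v3.5 Configuration API action pattern;
# confirm the exact path and required fields in the official reference.
URL = "https://api.livechatinc.com/v3.5/configuration/action/register_webhook"

ACCOUNT_ID = "<your-account-id>"          # placeholder
PAT = "<your-personal-access-token>"      # placeholder

payload = {
    "url": "https://example.com/webhooks/livechat",  # your publicly reachable receiver
    "action": "incoming_chat",                       # event to subscribe to
    "secret_key": "<shared-secret>",                 # sent back with each webhook for verification
    "type": "license",                               # "license" or "bot" (assumed field name)
}

resp = requests.post(URL, json=payload, auth=(ACCOUNT_ID, PAT))  # assumed: PAT with Basic auth
resp.raise_for_status()
print(resp.json())  # typically returns the new webhook's ID
```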
Webhook Types
LiveChat has two types of webhooks:
- License webhooks - Registered per Client ID and enabled per license
- Bot webhooks - Associated with bot status and require additional setup steps
Best Practices
- Use a service like webhook.site for testing webhooks before implementation
- Respond with HTTP 200 when receiving webhooks to prevent retries
- Consider splitting webhook processing into two steps if handling takes longer than the 10-second timeout (see the sketch after this list)
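One way to implement that two-step split is sketched below: the HTTP handler only enqueues the payload and returns 200 immediately, while a background thread does the slow work. The in-process queue and the `action` field are assumptions for illustration; a production setup would typically use a durable queue instead.

```python
import json
import queue
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Step 1: acknowledge immediately. Step 2: process in the background.
events = queue.Queue()

def worker():
    """Slow processing (CRM updates, analytics, etc.) happens off the request path."""
    while True:
        event = events.get()
        try:
            print("Processing", event.get("action", "<unknown>"))
            # ... long-running work goes here ...
        finally:
            events.task_done()

class QuickAckHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        events.put(body)          # hand off to the worker thread
        self.send_response(200)   # respond well within the 10-second window
        self.end_headers()

if __name__ == "__main__":
    threading.Thread(target=worker, daemon=True).start()
    HTTPServer(("0.0.0.0", 8080), QuickAckHandler).serve_forever()
```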
Use Cases
Webhooks are useful for integrating LiveChat with external systems like CRMs, marketing automation tools, or analytics platforms.
Rate Limits and Other Limitations
The rate limits for the LiveChat API are as follows:
Cloud Accounts
For cloud-hosted accounts, the API rate limit is set to 180 requests per minute, counted for each API key separately. This limit is in place to ensure the server operates smoothly and does not get overloaded.
Standalone Installations
For standalone installations, the same rate limit as cloud accounts (180 requests per minute) is applied by default. However, since you control your own server in standalone installations, it is possible to override and increase this limit if needed.
To increase the limit on standalone installations:
- Insert a row into the qu_g_settings table
- Set the name to "api_rate_limit"
- Set the value to the desired limit number (see the sketch below)
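The change boils down to a single row insert. The sketch below is a hypothetical illustration only: the column names ("name", "value"), the sqlite3 driver used as a stand-in, and the example value of 600 are all assumptions; your standalone installation will have its own database engine and schema.

```python
import sqlite3  # stand-in driver; a real installation likely uses its own database engine

# Hypothetical sketch: column names "name" and "value" are assumed, and the
# connection target is a placeholder for the installation's actual database.
conn = sqlite3.connect("livechat_settings.db")
conn.execute(
    "INSERT INTO qu_g_settings (name, value) VALUES (?, ?)",
    ("api_rate_limit", "600"),  # e.g. raise the limit to 600 requests per minute
)
conn.commit()
conn.close()
```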
Key Points to Consider
- The rate limit is applied per API key, allowing for separate limits for different integrations or applications.
- For standalone installations, the ability to increase the limit provides flexibility for high-volume use cases.
- When implementing API calls, it's important to respect these limits and implement proper error handling for rate limit responses.
Best Practices
- Implement proper error handling for rate limit responses (usually HTTP 429 status codes); see the sketch after this list.
- Consider caching frequently accessed data to reduce API calls.
- Use bulk operations where possible to minimize the number of individual API requests.
- For high-volume use cases on standalone installations, carefully consider the appropriate rate limit to balance performance and server load.
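A simple way to handle 429 responses (see the first bullet in this list) is to retry with backoff, as sketched below. Whether the API returns a Retry-After header is an assumption, so the helper falls back to exponential backoff; the commented call shows hypothetical usage against the v3.5 endpoint.

```python
import time
import requests

def call_with_rate_limit_retry(method, url, max_retries=5, **kwargs):
    """Retry a request when the API answers 429, backing off between attempts."""
    for attempt in range(max_retries):
        resp = requests.request(method, url, **kwargs)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp
        # Honour Retry-After if the server sends it; otherwise back off exponentially.
        delay = float(resp.headers.get("Retry-After", 2 ** attempt))
        time.sleep(delay)
    raise RuntimeError(f"Still rate limited after {max_retries} attempts: {url}")

# Hypothetical usage (placeholder endpoint and credentials):
# resp = call_with_rate_limit_retry(
#     "POST",
#     "https://api.livechatinc.com/v3.5/agent/action/list_chats",
#     json={},
#     auth=("<account-id>", "<personal-access-token>"),
# )
```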
While these limits apply specifically to the LiveChat API, it's worth noting that rate limiting is a common practice across many APIs to ensure fair usage and system stability. Always consult the most up-to-date documentation for the latest information on rate limits and best practices for API usage.
Latest API Version
The most recent version of the LiveChat API is v3.5. Here are the key points:
- The current stable version recommended for production use is Customer Chat Web API v3.5.
Key Points
- The API documentation and examples provided use the v3.5 endpoint (https://api.livechatinc.com/v3.5/).
- This version is documented as the latest stable version recommended for production use.
Recent Changes
- There was a decommissioning of v3.2 in LiveChat APIs and SDKs at the end of March 2023.
- Users were encouraged to migrate to v3.3, or preferably to the current stable version (v3.5), by March 31, 2023.
API Usage
- The API allows developers to automate workflows, create custom integrations, and access live chat data.
- It requires authentication using a Personal Access Token (PAT), which can be generated in the Developer Console (see the example request below).
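To tie the two points above together, here is a hedged example of calling the v3.5 endpoint with a PAT. The list_chats action, the POST-with-JSON-body convention, and Basic auth built from an account ID and the PAT are assumptions based on the v3.5 Web API pattern; confirm them in the API reference.

```python
import requests

ACCOUNT_ID = "<your-account-id>"       # placeholder
PAT = "<your-personal-access-token>"   # generated in the Developer Console

# Assumed: Web API actions are POST requests with a JSON body; an empty
# body falls back to the default parameters.
resp = requests.post(
    "https://api.livechatinc.com/v3.5/agent/action/list_chats",
    json={},
    auth=(ACCOUNT_ID, PAT),  # assumed: PAT used as the password in Basic auth
)
resp.raise_for_status()
print(resp.json())  # inspect the returned chat summaries
```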
Best Practices
- When using the API, it's recommended to always use the latest stable version for production environments.
- Developers should stay informed about API updates and deprecations to ensure their integrations remain functional.
- For any new development or updates to existing integrations, v3.5 is the version to target.
It's important to note that API versions can change over time, so developers should regularly check the official LiveChat API documentation for the most up-to-date information on API versions and features.
How to get a LiveChat developer account and API Keys?
- Sign up for a LiveChat Developer Console account
To create an API integration with LiveChat, you first need to sign up for a developer account in the LiveChat Developer Console. You can do this by visiting the LiveChat Developer Console signup page.
- Access Platform Developer Tools
Once you've signed up and logged in to the Developer Console, you'll have access to all the Platform Developer Tools needed to create integrations.
- Generate an API token
To authenticate your API requests, you'll need to generate a Personal Access Token (PAT):
- Go to the Developer Console
- Navigate to Tools > Personal Access Tokens
- Create a new PAT
- Implement authorization
If your integration needs to access user data, implement the appropriate authorization flow:
- For backend apps, consider using the server-side authorization code grant flow (a token-exchange sketch follows this list)
- For frontend integrations, you can use the "Sign in with LiveChat" flow
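For the server-side flow, the sketch below shows the standard OAuth 2.0 code-for-token exchange. The TOKEN_URL, the exact parameter set, and the response fields are assumptions; take the real values from the Authorization section of your app in the Developer Console.

```python
import requests

# Placeholders: the token endpoint URL and credentials below are assumptions,
# to be replaced with the values shown for your app in the Developer Console.
TOKEN_URL = "<livechat-oauth-token-endpoint>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
REDIRECT_URI = "https://example.com/oauth/callback"

def exchange_code_for_token(code: str) -> dict:
    """Server-side authorization code grant: swap the one-time code for tokens."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "authorization_code",
            "code": code,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "redirect_uri": REDIRECT_URI,
        },
    )
    resp.raise_for_status()
    return resp.json()  # expected to include an access token (and refresh token)
```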
What can you do with the LiveChat API?
Here's a list of data models you can interact with using the LiveChat API, along with what is possible for each:
Chats
- Retrieve chat transcripts
- Search and filter closed chats
- Access chat-related data like customer satisfaction, average length, etc.
- Get real-time information on current chat queues
Agents
- Change agent statuses
- Retrieve agent activity data
- Get real-time performance metrics for agents
Customers/Visitors
- Save visitor information
- Access visitor journey data
- Retrieve customer data during chats
Conversations (Messaging)
- Retrieve open and closed conversations
- Access conversation transcripts
- Search and filter conversations
Reporting
- Create custom reports by combining LiveChat data with other sources
- Access aggregated real-time data on contact center performance
- Retrieve historical data on agent activity, visitor journeys, etc.
Account/Skills
- Get account-level metrics and data
- Access skill-level performance information
Engagements
- Retrieve data on customer engagements
Surveys
- Access survey data and results
The LiveChat API allows you to interact with these core data models to automate workflows, enhance reporting, build custom integrations, and access both real-time and historical data across your LiveChat implementation. The API provides both raw data access as well as aggregated metrics for many of these models.
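As a closing illustration of the Chats model, here is a hedged sketch of retrieving closed chats filtered by a tag. The list_archives action name, the filter structure, and the authentication scheme are assumptions to verify against the v3.5 reference.

```python
import requests

ACCOUNT_ID = "<your-account-id>"
PAT = "<your-personal-access-token>"

# Assumed example: closed-chat transcripts via the Agent Chat Web API's
# list_archives action, filtered by a tag; field names are not guaranteed.
resp = requests.post(
    "https://api.livechatinc.com/v3.5/agent/action/list_archives",
    json={"filters": {"tags": {"values": ["billing"]}}},
    auth=(ACCOUNT_ID, PAT),
)
resp.raise_for_status()
print(resp.json())
```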