What’s the great thing about building an app with a public blockchain as a backend? The availability of the data. It’s always there, it’s always running, and all you have to do is query it. What’s the bad thing about it? ...There’s a lot of data to be queried.

So much, in fact, that you might find your dapp’s performance bottlenecked, or at least impacted, by what you’re getting back from the JSON-RPC interface. Luckily for you, there are solutions, and they live on Infura.

What are filters?

Filters are JSON-RPC methods that let you specify the information you're interested in receiving from the network. Each time you poll a filter, the node returns everything matching your criteria that has happened since the filter's previous poll. Filters are available over HTTPS.

You can create filters with the following API methods:

- eth_newFilter: creates a filter for logs matching a given set of addresses and topics.
- eth_newBlockFilter: creates a filter that notifies you when a new block arrives.
- eth_newPendingTransactionFilter: creates a filter that notifies you of new pending transactions.

Poll for changes to your filters using the eth_getFilterChanges API method, which returns an array of the changes that occurred since the last poll.
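As a sketch, here is what the raw JSON-RPC request bodies for that lifecycle look like. The filter ID below is a placeholder; in practice you use whatever ID the create call returns.

```javascript
// Illustrative JSON-RPC request bodies for the filter lifecycle.
// The filter ID ("0xabc123") is a placeholder for the node's actual response.
const createFilter = {
  jsonrpc: '2.0',
  id: 1,
  method: 'eth_newBlockFilter', // or eth_newFilter / eth_newPendingTransactionFilter
  params: []
};

// The create call answers with a filter ID, e.g. { "result": "0xabc123" }.
// Each subsequent poll reuses that ID and gets back only what is new since
// the previous poll.
const pollFilter = {
  jsonrpc: '2.0',
  id: 2,
  method: 'eth_getFilterChanges',
  params: ['0xabc123'] // the ID returned by the create call
};

console.log(JSON.stringify(createFilter));
console.log(JSON.stringify(pollFilter));
```

Each of these bodies is POSTed to your HTTPS endpoint, exactly as in the full examples further down.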

Using filters offers several advantages. First of all, you control when they fire, which lets you decide exactly where that call goes out within the flow of your dapp. And even if it's been a while since your last poll, you'll be updated with everything that's happened since your previous call. The downside to that feature is that the filter maintains state in the node, keyed by a filter ID. That ID is subject to garbage collection, so it requires its own maintenance.

At the end of the day, a filter can be a very stable option for network data querying: after all, it’s a series of discrete requests, as opposed to a long-lived connection that may break or be interrupted.

What are subscriptions?

The other option, of course, is using the JSON-RPC pub/sub service. As opposed to filters, where your dapp actively requests information from the network, in a pub/sub scenario your dapp passively receives the information as it is published. Once the connection is established, the data is essentially pushed from the node to you. The pub/sub service is only available via WebSockets.

One thing to keep in mind about setting up your data querying through subscriptions is stability. If a WebSocket connection closes, the subscription is automatically closed and removed. Additionally, data received through subscriptions is stored in an internal buffer; if that buffer runs out of space, the connection will also be closed.

Another aspect of subscriptions is that they report only current events. With filters, you get updated with "everything since the last time you called," so to speak. That's not the case with subscriptions, which makes this a perfect example of choosing the method that best fits your use case.

Are you building a ticker reporting most recent events? Maybe a subscription is what you’re looking for.

Are you building an app that displays transaction histories? This might be the case for a filter.

Integrations with Infura

Whichever option fits your use case best, Infura is ready to help you query the network. If you're looking for next steps or some examples, dive into the documentation.

Examples

In the following examples we'll show how to watch for transfer events using both methods. We'll stay at the raw HTTPS JSON-RPC and WebSocket level, without using any Web3 libraries.

This first example is a suboptimal way to process the latest block and pull the transfer events out of it. Imagine this code running in a loop: we call eth_blockNumber to detect when a new block appears, then make a follow-up eth_getLogs call for that block. This approach is prone to missing blocks, depending on network latency, how accurately the loop is timed, and any processing or P2P block catch-up the node may be doing internally.

const axios = require('axios');

const infuraUrl = "https://mainnet.infura.io/v3/<ProjectID>";
const payloadProto = {
  jsonrpc: "2.0",
  method: "",
  params: [],
  id: 1
};

(async () => {
  // Let's get the latest block number
  const blockPayload = Object.assign({}, payloadProto, { method: 'eth_blockNumber' });
  let resp = await axios({
    url: infuraUrl,
    method: 'POST',
    data: blockPayload,
    headers: { 'Content-Type': 'application/json' }
  });
  const blockNumber = resp.data.result; // a hex string, e.g. "0xa1b2c3"
  console.log(blockNumber);

  // Get the Transfer events in this block
  const logPayload = Object.assign({}, payloadProto, {
    method: 'eth_getLogs',
    params: [{
      fromBlock: blockNumber,
      toBlock: blockNumber,
      topics: ["0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"]
    }]
  });
  resp = await axios({
    url: infuraUrl,
    method: 'POST',
    data: logPayload,
    headers: { 'Content-Type': 'application/json' }
  });
  for (const log of resp.data.result) {
    console.log(log);
  }
})();

Let's improve the code by using the eth_newBlockFilter and eth_getFilterChanges APIs. The benefit is that we can poll the API for changes on our own schedule, and network conditions don't matter; we could even stop this script and restart it minutes later. As long as we use a consistent filter ID, the node will return everything that changed since the last time we polled the server. Since the filter ID can get garbage-collected during high node usage, extra care is required to ensure the filter still exists; you'll need to recreate it if it no longer does.

const axios = require('axios');

const infuraUrl = "https://mainnet.infura.io/v3/<ProjectID>";
const payloadProto = {
  jsonrpc: "2.0",
  method: "",
  params: [],
  id: 1
};
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

(async () => {
  // Start by getting a filter ID.
  // We are interested in getting new blocks, so we create a newBlockFilter.
  const newFilterPayload = Object.assign({}, payloadProto, { method: 'eth_newBlockFilter' });
  const filterIdResponse = await axios({
    url: infuraUrl,
    method: 'POST',
    data: newFilterPayload,
    headers: { 'Content-Type': 'application/json' }
  });
  const filterId = filterIdResponse.data.result;
  console.log(`Using filterId: ${filterId}`);

  const newBlockPayload = Object.assign({}, payloadProto, { method: 'eth_getFilterChanges', params: [filterId] });
  while (true) {
    const newBlockResponse = await axios({
      url: infuraUrl,
      method: 'POST',
      data: newBlockPayload,
      headers: { 'Content-Type': 'application/json' }
    });
    const hashes = newBlockResponse.data.result;
    if (hashes.length === 0) {
      await sleep(1000); // no new blocks yet; wait before polling again
      continue;
    }
    // Since we get block hashes back with this, the getLogs call changes
    // slightly: we query by blockHash instead of a block range.
    hashes.map(async (hash) => {
      const logByHashPayload = Object.assign({}, payloadProto, {
        method: 'eth_getLogs',
        params: [{
          blockHash: hash,
          topics: ["0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"]
        }]
      });
      const logResponse = await axios({
        url: infuraUrl,
        method: 'POST',
        data: logByHashPayload,
        headers: { 'Content-Type': 'application/json' }
      });
      if (logResponse.data.result) {
        console.log(`Found: ${logResponse.data.result.length} logs for Block: ${hash}`);
      } else {
        console.log(logResponse.data);
      }
    });
  }
})();
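One thing the loop above does not handle is the filter ID being garbage-collected mid-run. A minimal sketch of recovery, assuming a Geth-style "filter not found" error response, could look like the following. The rpcCall(method, params) function is a hypothetical stand-in for any JSON-RPC transport (for example, the axios POST used in the examples) that resolves to the parsed response body.

```javascript
// A sketch of recovering from a garbage-collected filter ID.
// `rpcCall(method, params)` is an assumed, injected JSON-RPC transport.
async function createLogFilter(rpcCall) {
  const resp = await rpcCall('eth_newFilter', [{
    topics: ["0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"]
  }]);
  return resp.result;
}

async function getChangesWithRecovery(rpcCall, filterId) {
  const resp = await rpcCall('eth_getFilterChanges', [filterId]);
  // Geth-style nodes report a pruned ID with a "filter not found" error;
  // in that case we recreate the filter and retry with the fresh ID.
  if (resp.error && /filter not found/i.test(resp.error.message)) {
    const freshId = await createLogFilter(rpcCall);
    const retry = await rpcCall('eth_getFilterChanges', [freshId]);
    return { filterId: freshId, changes: retry.result };
  }
  return { filterId, changes: resp.result };
}
```

The caller keeps whatever filterId comes back and passes it into the next poll, so a recreated filter is picked up transparently. Note that any events emitted between the ID being pruned and the new filter being created are lost, which is the maintenance cost mentioned earlier.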

We could further improve this by skipping the new-block retrieval entirely and creating a filter directly on logs. This simplifies the polling:

const axios = require('axios');

const infuraUrl = "https://mainnet.infura.io/v3/<ProjectID>";
const payloadProto = {
  jsonrpc: "2.0",
  method: "",
  params: [],
  id: 1
};
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

(async () => {
  // Like all filters, start by getting a new filter ID.
  // In this case we can pass the log params straight into the filter to specify the match.
  // The node will then return the matching events starting from the `latest` block.
  const filterIdPayload = Object.assign({}, payloadProto, {
    method: 'eth_newFilter',
    params: [{
      topics: ["0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"]
    }]
  });
  const filterResponse = await axios({
    url: infuraUrl,
    method: 'POST',
    data: filterIdPayload,
    headers: { 'Content-Type': 'application/json' }
  });
  const filterId = filterResponse.data.result;

  const filterChangePayload = Object.assign({}, payloadProto, { method: 'eth_getFilterChanges', params: [filterId] });
  while (true) {
    const filterLogResponse = await axios({
      url: infuraUrl,
      method: 'POST',
      data: filterChangePayload,
      headers: { 'Content-Type': 'application/json' }
    });
    console.log(filterLogResponse.data);
    await sleep(1000); // poll at a sane interval
  }
})();

Now let’s try to get the same data by subscribing for logs using a WebSocket connection. This method returns logs that are included in newly imported blocks.

Please note that the server closes the WebSocket connection after one hour of being idle (that is, no data coming over it). Make sure you add a reconnect routine, and unsubscribe when the subscription is no longer needed. When subscribing to relatively quiet contracts, you can also subscribe to newHeads, which keeps the WebSocket connection from going idle.

const WebSocket = require('ws');
const ws = new WebSocket('wss://mainnet.infura.io/ws/v3/<ProjectID>');

ws.on('open', function open() {
  // ws.send('{"jsonrpc":"2.0", "id": 1, "method": "eth_subscribe", "params": ["newHeads"]}');
  ws.send('{"jsonrpc":"2.0", "id": 1, "method": "eth_subscribe", "params": ["logs", {"topics": ["0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"]}]}');
});

ws.on('message', function message(data) {
  console.log('received: %s', data);
});
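For completeness, here is a minimal reconnect-and-unsubscribe sketch. The helper names and the injected WebSocket constructor are illustrative assumptions, not part of the ws API; in practice you would call connect(require('ws'), 'wss://mainnet.infura.io/ws/v3/<ProjectID>').

```javascript
// A sketch of a reconnect routine plus an explicit eth_unsubscribe payload.
// The WebSocket constructor is injected so the payload helpers stay easy to
// exercise without a live connection.
const TRANSFER_TOPIC =
  '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef';

function subscribePayload(id) {
  return JSON.stringify({
    jsonrpc: '2.0', id, method: 'eth_subscribe',
    params: ['logs', { topics: [TRANSFER_TOPIC] }]
  });
}

// The subscription ID to cancel is the "result" of the eth_subscribe response.
function unsubscribePayload(id, subscriptionId) {
  return JSON.stringify({
    jsonrpc: '2.0', id, method: 'eth_unsubscribe',
    params: [subscriptionId]
  });
}

function connect(WebSocketImpl, url) {
  const ws = new WebSocketImpl(url);
  ws.on('open', () => ws.send(subscribePayload(1)));
  ws.on('message', (data) => console.log('received: %s', data));
  // If the server drops an idle or overflowing connection, the subscription
  // dies with it, so re-dial and re-subscribe after a short pause.
  ws.on('close', () => setTimeout(() => connect(WebSocketImpl, url), 5000));
  ws.on('error', (err) => console.error('ws error:', err.message));
  return ws;
}

// Usage (requires the `ws` package):
// connect(require('ws'), 'wss://mainnet.infura.io/ws/v3/<ProjectID>');
```

When shutting down cleanly, send unsubscribePayload with the subscription ID you received before closing the socket, so the node can release the subscription immediately instead of waiting for the connection to drop.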