How to Use a Proxy in PHP (2025)

In this guide, I’ll show you how to set up a proxy in PHP with cURL. I’ll also cover how to handle proxy authentication, rotate proxies to avoid blocks, and use reliable premium proxies like Bright Data. So, let’s dive in and make your web scraping smoother and more anonymous!

Setting Up a Proxy in PHP

First, let’s see how to set up a basic proxy using PHP’s cURL library. For this, you can get a free proxy from a public proxy list. Keep in mind that free proxies tend to be less reliable and may stop working at any time.

Step 1: Create a Scraper Function

To get started, create a PHP script where you’ll set up the cURL request and configure the proxy. Here’s an example of how you can do this:

<?php
function scraper() {
    // Initialize cURL
    $curl = curl_init();
    // Set the target URL
    curl_setopt($curl, CURLOPT_URL, "https://httpbin.org/ip");
    // Set the proxy address
    curl_setopt($curl, CURLOPT_PROXY, "http://50.223.246.237:80");
    // Return the response as a string instead of printing it directly
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    // Follow redirects
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
    // Execute the cURL request
    $html = curl_exec($curl);
    // Report errors, otherwise display the response
    if (curl_errno($curl)) {
        echo 'cURL error: ' . curl_error($curl);
    } else {
        echo $html;
    }
    // Close the cURL session
    curl_close($curl);
}

// Run the function
scraper();
?>

Output

The output will show the proxy IP being used to make the request:

{
  "origin": "50.223.246.237"
}

If you get an error like “CONNECT tunnel failed,” it’s because free proxies can be unreliable. In that case, simply use a different one from your list.
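If you want the script to recover on its own, a minimal sketch like the one below loops over a small list of free proxies and falls back to the next one whenever a request fails. The proxy addresses and the fetchWithFallback() helper are placeholders for illustration; swap in entries from your own list.

<?php
// A minimal fallback sketch: try each free proxy in turn until one returns a response.
// The proxy addresses below are placeholders; replace them with entries from your own list.
function fetchWithFallback($url, $proxies) {
    foreach ($proxies as $proxy) {
        $curl = curl_init();
        curl_setopt($curl, CURLOPT_URL, $url);
        curl_setopt($curl, CURLOPT_PROXY, $proxy);
        curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10); // give up quickly on dead proxies
        $html = curl_exec($curl);
        $failed = curl_errno($curl);
        curl_close($curl);
        if (!$failed) {
            return $html; // first working proxy wins
        }
    }
    return false; // every proxy in the list failed
}

$proxies = [
    "http://50.223.246.237:80",
    "http://50.207.199.82:80",
];
$result = fetchWithFallback("https://httpbin.org/ip", $proxies);
echo $result !== false ? $result : "All proxies failed.";
?>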

Proxy Authentication in PHP

Many paid proxy services require authentication before you can use their proxies. To authenticate, you need to include your proxy username and password in the cURL request.

Here’s an example of how to authenticate your proxy in PHP:

<?php
function scraper() {
    // Initialize cURL
    $curl = curl_init();
    // Set the target URL
    curl_setopt($curl, CURLOPT_URL, "https://httpbin.org/ip");
    // Set the proxy address
    curl_setopt($curl, CURLOPT_PROXY, "http://54.37.214.253:8080");
    // Provide the proxy credentials
    curl_setopt($curl, CURLOPT_PROXYUSERPWD, "username:password");
    // Return the response as a string instead of printing it directly
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    // Follow redirects
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
    // Execute the cURL request
    $html = curl_exec($curl);
    // Report errors, otherwise display the response
    if (curl_errno($curl)) {
        echo 'cURL error: ' . curl_error($curl);
    } else {
        echo $html;
    }
    // Close the cURL session
    curl_close($curl);
}

// Run the function
scraper();
?>

If you enter the wrong username or password, you’ll get a 407 (Proxy Authentication Required) error. Double-check your credentials and try again.
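If you’d rather have the script report this explicitly, one option is to read the status codes that cURL recorded with curl_getinfo() before closing the handle. This is just a sketch: for HTTPS targets the 407 usually comes back on the CONNECT request (CURLINFO_HTTP_CONNECTCODE), while for plain HTTP targets it shows up as the normal response code (CURLINFO_HTTP_CODE). The proxy address and credentials below are placeholders.

<?php
// A small sketch for spotting proxy authentication failures.
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, "https://httpbin.org/ip");
curl_setopt($curl, CURLOPT_PROXY, "http://54.37.214.253:8080");   // placeholder proxy
curl_setopt($curl, CURLOPT_PROXYUSERPWD, "username:password");    // placeholder credentials
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($curl);

// Check both the normal response code and the CONNECT response code
$responseCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
$connectCode  = curl_getinfo($curl, CURLINFO_HTTP_CONNECTCODE);
if ($responseCode === 407 || $connectCode === 407) {
    echo "Proxy authentication failed - check your username and password.";
} elseif (curl_errno($curl)) {
    echo "cURL error: " . curl_error($curl);
} else {
    echo $html;
}
curl_close($curl);
?>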

Rotating Proxies in PHP

Rotating proxies is a great way to avoid detection and blocks. The technique switches between multiple proxies, using a different one for each request, so every request appears to come from a different location. This helps you avoid IP bans and rate limiting.

Here’s how to implement proxy rotation in PHP:

<?php
// Proxy rotation function: pick a random proxy from the list
function proxyRotator($proxy_list) {
    return $proxy_list[array_rand($proxy_list)];
}

function scraper() {
    // List of proxies
    $proxies = [
        'http://203.115.101.51:82',
        'http://50.207.199.82:80',
        'http://188.68.52.244:80',
    ];
    // Get a random proxy from the list
    $proxy = proxyRotator($proxies);
    // Initialize cURL
    $curl = curl_init();
    // Set the target URL
    curl_setopt($curl, CURLOPT_URL, "https://httpbin.org/ip");
    // Set the proxy address
    curl_setopt($curl, CURLOPT_PROXY, $proxy);
    // Return the response as a string instead of printing it directly
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    // Follow redirects
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
    // Execute the cURL request
    $html = curl_exec($curl);
    // Report errors, otherwise display the response
    if (curl_errno($curl)) {
        echo 'cURL error: ' . curl_error($curl);
    } else {
        echo $html;
    }
    // Close the cURL session
    curl_close($curl);
}

// Run the function
scraper();
?>

Run the script a few times and the output will show a different IP address on each run, confirming that the proxy rotation works:

{
  "origin": "188.68.52.244"
}

While free proxies are great for testing, they have a high failure rate and aren’t suitable for large-scale operations. For reliable scraping, it’s best to use premium proxies.

Using Premium Proxies for Reliability

Premium proxies, like those from Bright Data or Smartproxy, offer highly reliable residential IPs. These proxies are less likely to be flagged because they are tied to real users’ devices. Bright Data and Smartproxy also provide features like geo-location targeting and built-in proxy rotation, which help you bypass geo-restrictions and avoid detection.

Here’s an example of how to integrate Bright Data’s residential proxies with PHP:

<?php
function scraper() {
    // Initialize cURL
    $curl = curl_init();
    // Set the target URL
    curl_setopt($curl, CURLOPT_URL, "https://httpbin.org/ip");
    // Set the proxy address (replace with your Bright Data proxy host and port)
    curl_setopt($curl, CURLOPT_PROXY, "http://your_proxy_address");
    // Provide your Bright Data credentials
    curl_setopt($curl, CURLOPT_PROXYUSERPWD, "your_username:your_password");
    // Return the response as a string instead of printing it directly
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    // Follow redirects
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
    // Execute the cURL request
    $html = curl_exec($curl);
    // Report errors, otherwise display the response
    if (curl_errno($curl)) {
        echo 'cURL error: ' . curl_error($curl);
    } else {
        echo $html;
    }
    // Close the cURL session
    curl_close($curl);
}

// Run the function
scraper();
?>

Best Practices for Using Proxies

To ensure your scraper works efficiently and stays undetected, keep the following best practices in mind:

  • Rotate IPs: Use rotating proxies to avoid detection. If you’re scraping large amounts of data, this is crucial.
  • Use Proper Headers: Set request headers like User-Agent to make requests appear more like regular web traffic.
  • Control Request Speed: Sending too many requests too quickly can lead to detection. Implement delays between requests or use exponential backoff for retries (headers and delays are both shown in the sketch after this list).
  • Be Ethical: Follow the rules in the site’s robots.txt file, and avoid scraping data that violates the site’s terms of service.
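Here’s a minimal sketch that combines two of the points above: a browser-like User-Agent header plus a delay between requests, doubling the wait whenever a request fails. The politeFetch() helper, the header values, and the timings are placeholders, not a prescribed setup.

<?php
// A sketch that applies two of the practices above: browser-like headers and
// delays between requests with simple exponential backoff on failure.
function politeFetch($url, $maxRetries = 3) {
    $delay = 1; // seconds to wait before retrying (placeholder value)
    for ($attempt = 0; $attempt < $maxRetries; $attempt++) {
        $curl = curl_init();
        curl_setopt($curl, CURLOPT_URL, $url);
        curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
        // Make the request look like regular browser traffic
        curl_setopt($curl, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)");
        curl_setopt($curl, CURLOPT_HTTPHEADER, [
            "Accept: text/html,application/xhtml+xml",
            "Accept-Language: en-US,en;q=0.9",
        ]);
        $html = curl_exec($curl);
        $failed = curl_errno($curl);
        curl_close($curl);
        if (!$failed) {
            return $html;
        }
        sleep($delay);   // back off before the next attempt
        $delay *= 2;     // exponential backoff: 1s, 2s, 4s, ...
    }
    return false;
}

// Pause between successive requests as well
foreach (["https://httpbin.org/ip", "https://httpbin.org/headers"] as $url) {
    $page = politeFetch($url);
    echo ($page !== false ? $page : "Request failed") . PHP_EOL;
    sleep(2); // simple fixed delay between requests
}
?>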

Conclusion

Setting up proxies in PHP is easy and can help you avoid IP bans when scraping data. Whether you are using free proxies for testing or premium ones like Bright Data for better reliability, proxies are a must for any serious web scraper. By rotating proxies, handling authentication, and following best practices, you’ll be able to scrape websites safely and efficiently.