Python version of ssh -D (SOCKS proxy over SSH)
I'm trying to use urllib2 through a proxy to scrape a web page that isn't directly available (it's running on a remote server's local network and isn't externally accessible). The proxy I'd prefer is an SSH SOCKS proxy (like what I'd get if I ran `ssh -D 9090 server`), both because I have access and because it's secure. I've had a poke around Paramiko, but everything I find points to running an SSH connection out over SOCKS, which is the opposite of what I'm trying to accomplish here. I have seen that the Transport class does dumb forwarding, but it doesn't provide a nice OpenSSH-style SOCKS proxy interface that I could latch onto with SocksiPy (et al). Net::SSH::Socks for Ruby is what I'm looking for, but in the wrong language. Is there anything available in Python that provides a SOCKS proxy over SSH?

I do have a workaround that works for scraping: instead of trying to use the SSH connection as a proxy, I'm using a remote shell to pull out the data:

```python
from bs4 import BeautifulSoup
import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect('example.com', username='oli', look_for_keys=True, timeout=5)
stdin, stdout, stderr = ssh.exec_command(...)
```
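For what it's worth, a minimal `ssh -D` equivalent can be sketched on top of Paramiko's `'direct-tcpip'` channels, which are the same mechanism OpenSSH uses for `-D`: accept SOCKS5 connections locally, parse the CONNECT request, and open a channel through the SSH transport for each one. The host and username below come from the question; everything else (function names, the no-auth-only handshake) is illustrative, not a drop-in implementation.

```python
# Rough sketch of an "ssh -D"-style SOCKS5 proxy built on Paramiko's
# 'direct-tcpip' channels. Only the no-auth method and CONNECT command
# (RFC 1928) are handled; error replies are omitted for brevity.
import socket
import threading

def parse_socks5_connect(data):
    """Parse a SOCKS5 CONNECT request into (host, port).

    `data` is the request that follows the method-selection greeting:
    VER CMD RSV ATYP DST.ADDR DST.PORT.
    """
    ver, cmd, atyp = data[0], data[1], data[3]
    if ver != 5 or cmd != 1:          # only CONNECT is handled here
        raise ValueError("unsupported SOCKS request")
    if atyp == 1:                     # IPv4 address
        host = socket.inet_ntoa(data[4:8])
        port = (data[8] << 8) | data[9]
    elif atyp == 3:                   # domain name, length-prefixed
        n = data[4]
        host = data[5:5 + n].decode("ascii")
        port = (data[5 + n] << 8) | data[6 + n]
    else:
        raise ValueError("unsupported address type")
    return host, port

def _pipe(src, dst):
    # Copy bytes one way until EOF; works for both sockets and channels.
    while True:
        buf = src.recv(4096)
        if not buf:
            break
        dst.sendall(buf)

def handle_client(client, transport):
    """Speak just enough SOCKS5 to open a Paramiko channel, then pipe bytes."""
    client.recv(262)                              # greeting: accept no-auth
    client.sendall(b"\x05\x00")
    host, port = parse_socks5_connect(client.recv(4096))
    chan = transport.open_channel(                # tunnel through the SSH box
        "direct-tcpip", (host, port), client.getpeername())
    client.sendall(b"\x05\x00\x00\x01\x00\x00\x00\x00\x00\x00")  # success
    for src, dst in ((client, chan), (chan, client)):
        threading.Thread(target=_pipe, args=(src, dst), daemon=True).start()

def serve(local_port=9090):
    # Imported lazily so the parsing helpers above are usable without Paramiko.
    import paramiko
    ssh = paramiko.SSHClient()
    ssh.load_system_host_keys()
    ssh.connect("example.com", username="oli", look_for_keys=True)
    listener = socket.socket()
    listener.bind(("127.0.0.1", local_port))
    listener.listen(5)
    while True:
        client, _ = listener.accept()
        threading.Thread(target=handle_client,
                         args=(client, ssh.get_transport()), daemon=True).start()

# serve()  # start the proxy (requires a reachable SSH server)
```

With this listening on 127.0.0.1:9090, SocksiPy could point urllib2 at it exactly as it would at a real `ssh -D 9090` tunnel.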