Python version of ssh -D (SOCKS proxy over SSH)


I'm trying to use urllib2 through a proxy to scrape a web page that isn't directly available (it's running on a remote server's local network and isn't externally accessible). The proxy I'd prefer is an SSH SOCKS proxy (as if I had run ssh -D 9090 server), both because I have access and because it's secure.

I've had a poke around Paramiko, and all I can find points to running an SSH connection out over SOCKS, which is the opposite of what I'm trying to accomplish here.

I have seen that the Transport class does dumb forwarding, but it doesn't provide the nice OpenSSH-style SOCKS proxy interface that I could latch onto with SocksiPy (et al.).
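For reference, the "dumb forwarding" the Transport class does is a per-connection "direct-tcpip" channel: you ask the server to connect out to one specific host and port, and you get back a socket-like channel. A minimal sketch (hostnames, username, and target address are placeholders, not from the original post):

    # Sketch of paramiko's low-level forwarding: one "direct-tcpip"
    # channel per destination, rather than a general SOCKS proxy.
    def direct_tcpip_args(dest_host, dest_port, src=('127.0.0.1', 0)):
        """Build the arguments for Transport.open_channel() so the SSH
        server connects onward to (dest_host, dest_port) on our behalf."""
        return ('direct-tcpip', (dest_host, dest_port), src)

    if __name__ == '__main__':
        import paramiko
        client = paramiko.SSHClient()
        client.load_system_host_keys()
        client.connect('example.com', username='oli')
        # The returned channel behaves like a socket, so you can speak
        # HTTP over it directly -- but only to this one destination.
        chan = client.get_transport().open_channel(
            *direct_tcpip_args('internal-host', 80))
        chan.sendall(b'GET / HTTP/1.0\r\nHost: internal-host\r\n\r\n')

This is why it falls short of ssh -D: a SOCKS proxy would open one of these channels dynamically per client connection, which paramiko leaves to you.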

Net::SSH::Socks for Ruby is exactly what I'm looking for, but in the wrong language. Is there anything available in Python that provides a SOCKS proxy over SSH?
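One pragmatic option, though not pure Python, is to spawn OpenSSH's own dynamic forwarder as a subprocess and point SocksiPy at it. A sketch, assuming ssh is on the PATH and the hostname/port are placeholders:

    # Sketch: let OpenSSH provide the SOCKS proxy (ssh -D), driven
    # from Python via subprocess; SocksiPy then routes urllib2 through it.
    import subprocess

    def socks_tunnel_cmd(host, local_port=9090):
        """Build the ssh command line that opens a SOCKS proxy on local_port."""
        return ['ssh', '-N',            # no remote command, forwarding only
                '-D', str(local_port),  # dynamic (SOCKS) port forwarding
                host]

    if __name__ == '__main__':
        proc = subprocess.Popen(socks_tunnel_cmd('example.com'))
        # Once the tunnel is up, route sockets through it with SocksiPy:
        #   socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5, 'localhost', 9090)
        #   socket.socket = socks.socksocket
        # then use urllib2 as normal, and proc.terminate() when finished.

The downside is managing an external process (waiting for the tunnel to come up, cleaning it up afterwards), which is exactly what an in-Python answer would avoid.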

I have a workaround that works for scraping: instead of trying to use the SSH connection as a proxy, I'm using a remote shell to pull out the data:

    from bs4 import BeautifulSoup
    import paramiko

    ssh = paramiko.SSHClient()
    ssh.load_system_host_keys()
    ssh.connect('example.com', username='oli', look_for_keys=True, timeout=5)

    stdin, stdout, stderr = ssh.exec_command('/usr/bin/wget -qO- "%s"' % url)
    soup = BeautifulSoup(stdout)

    ssh.close()

This isn't what I was looking for to begin with (and I'd still like to see if there's a way of connecting to a SOCKS socket over SSH), but there is an elegance in its simplicity.

