From srinivasaenergy at gmail.com Wed Apr 1 08:23:30 2015 From: srinivasaenergy at gmail.com (Srinivasa Rao) Date: Wed, 1 Apr 2015 11:53:30 +0530 Subject: [BangPypers] Collecting Trending Topics, Social Media sentiments from Twitter/Facebook In-Reply-To: References: Message-ID: Hi All, If anyone is interested in Java, Python, Testing, or Angular JS, please share your resume with our company. Work locations are in India and around the world; a minimum of 2.5 years' experience is required to apply. Regards, Srinivasa TechLead On Tue, Mar 31, 2015 at 3:15 PM, Jins Thomas wrote: > Dear Experts, > > Would like to know your thoughts on available Python modules/APIs for > extracting information from Twitter and Facebook. > > > Requirement: An organization has members and committees all across > Karnataka; their members will tweet/post daily/weekly updates with a > particular hashtag. The organization will extract this information and > create a report on the monthly meetings held in different districts, > the different programs they organized, and so on. > > Would be great to have some thoughts from people who have already worked with these > Twitter/Facebook APIs. > > Thank you very much in anticipation. > > > Thanks > Jins Thomas > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From srinivasaenergy at gmail.com Wed Apr 1 08:24:31 2015 From: srinivasaenergy at gmail.com (Srinivasa Rao) Date: Wed, 1 Apr 2015 11:54:31 +0530 Subject: [BangPypers] Collecting Trending Topics, Social Media sentiments from Twitter/Facebook In-Reply-To: References: Message-ID: Please share your resumes with me. On Wed, Apr 1, 2015 at 11:53 AM, Srinivasa Rao wrote: > Hi All, > > If anyone is interested in Java, Python, Testing, or Angular JS, please > share your resume with our company. Work locations are in India and around the world; > a minimum of 2.5 years' experience is required to apply.
> > Regards, > Srinivasa > TechLead > > On Tue, Mar 31, 2015 at 3:15 PM, Jins Thomas wrote: > >> Dear Experts, >> >> Would like to know your thoughts on available Python modules/APIs for >> extracting information from Twitter and Facebook. >> >> >> Requirement: An organization has members and committees all across >> Karnataka; their members will tweet/post daily/weekly updates with a >> particular hashtag. The organization will extract this information and >> create a report on the monthly meetings held in different districts, >> the different programs they organized, and so on. >> >> Would be great to have some thoughts from people who have already worked with these >> Twitter/Facebook APIs. >> >> Thank you very much in anticipation. >> >> >> Thanks >> Jins Thomas >> _______________________________________________ >> BangPypers mailing list >> BangPypers at python.org >> https://mail.python.org/mailman/listinfo/bangpypers >> > > From dinakar at gridlex.com Fri Apr 3 08:50:22 2015 From: dinakar at gridlex.com (Dinakar K) Date: Fri, 3 Apr 2015 12:20:22 +0530 Subject: [BangPypers] Simple and profound python tricks Message-ID: Hi Guys, We built a useful page about simple and profound Python tricks and got a very good response on Reddit. Here is the link: Simple and profound python tricks. We would love to get your feedback on the page and on the type of content you want us to add to the site. Please feel free to add your own tricks. From nimish.s.dalal at gmail.com Fri Apr 3 09:11:56 2015 From: nimish.s.dalal at gmail.com (Nimish Dalal) Date: Fri, 3 Apr 2015 12:41:56 +0530 Subject: [BangPypers] Simple and profound python tricks In-Reply-To: References: Message-ID: Excellent guys. This is definitely going to help n00bs like me who recently started learning Python.
On Fri, Apr 3, 2015 at 12:20 PM, Dinakar K wrote: > Hi Guys, > > We built a useful page about simple and profound python tricks > and got a very good response from reddit > > here is the link Simple and profound python tricks > < > https://www.reddit.com/r/Python/comments/316dm5/simple_profound_python_tricks/ > > > > we would love to get your feedback on the page and the type of content you > want us to add to the site > > please feel free to add your tricks > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > -- Nimish S. Dalal Cell: 9819670499 Facebook: http://www.facebook.com/nimish.s.dalal Twitter: http://twitter.com/nimishdalal Linkedin:http://in.linkedin.com/in/nimishsdll Url: http://www.nimishdalal.me Our generation has had no Great Depression, no Great War. Our war is a spiritual war. Our great depression is our lives. From anandpillai at letterboxes.org Fri Apr 3 11:11:14 2015 From: anandpillai at letterboxes.org (Anand B Pillai) Date: Fri, 03 Apr 2015 14:41:14 +0530 Subject: [BangPypers] Simple and profound python tricks In-Reply-To: References: Message-ID: <551E5932.6040005@letterboxes.org> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On Friday 03 April 2015 12:20 PM, Dinakar K wrote: > Hi Guys, > > We built a useful page about simple and profound python tricks and > got a very good response from reddit > > here is the link Simple and profound python tricks > Not > bad, but please fix your font color. 
> > we would love to get your feedback on the page and the type of > content you want us to add to the site > > please feel free to add your tricks > _______________________________________________ BangPypers mailing > list BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > - -- Regards, - --Anand - ---------------------------- Software Architect/Consultant anandpillai at letterboxes.org Cell: +919880078014 -----BEGIN PGP SIGNATURE----- Version: GnuPG v1 Comment: Using GnuPG with Thunderbird - http://www.enigmail.net/ iQEcBAEBAgAGBQJVHlkyAAoJEHKU2n17CpvDkDcIAKWX90YI+L+qKIbCWZHPJ4C3 yNzanX8QBZ6mw7NckmE2gvptbIVi4HiL/7jIbhIbHRDUuREnsxqKd97l3IwlpiEr x8pCvjCcnhwukkI6wYQRYvLiVbs9gtRlgnj4fv6EUh2tKfRU+XBOFA9Gs5O5AWzq 0yNi3R6o84YKBYHWO/JPcjWk8R5a7ihSbOt0ogaUUfTNr0mQWHWhfL0QAYuya9I4 jpGHromhoO+xgJ9NZJWZyPobluSL4ZDvN+VE/fiiGyQXNsGkrXGYfZKCJzeQKmfR 59IMuAHIif63lC2WlLSwHJ3jKKhTe+IPzqjRXAhLNyufY4p9h8ki8r8rvhPXLfs= =T8Mb -----END PGP SIGNATURE----- From veraks18 at gmail.com Fri Apr 3 19:11:43 2015 From: veraks18 at gmail.com (Akshay Verma) Date: Fri, 3 Apr 2015 19:11:43 +0200 Subject: [BangPypers] Digitising text? Message-ID: I need information about the ways to digitise a text and their comparisons Best Regards, Akshay Verma. From kracethekingmaker at gmail.com Fri Apr 3 19:21:01 2015 From: kracethekingmaker at gmail.com (kracekumar ramaraju) Date: Fri, 3 Apr 2015 22:51:01 +0530 Subject: [BangPypers] Digitising text? In-Reply-To: References: Message-ID: Hi Akshay, Please read the mailing-list etiquette guide, http://www.shakthimaan.com/downloads/glv/presentations/mailing-list-etiquette.pdf. One-liners like these are hard to understand and hard to help with. On Fri, Apr 3, 2015 at 10:41 PM, Akshay Verma wrote: > I need information about the ways to digitise a text and their comparisons > > Best Regards, > Akshay Verma.
> _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > -- Thanks & Regards, kracekumar. "Talk is cheap, show me the code" -- Linus Torvalds. http://kracekumar.com From careers at doublespring.com Wed Apr 8 12:03:36 2015 From: careers at doublespring.com (Shuhaib Shariff) Date: Wed, 8 Apr 2015 15:33:36 +0530 Subject: [BangPypers] [JOBS] Python / Django Developer Message-ID: DoubleSpring seeks passionate DJANGO developers with experience in building or contributing to Open Source projects. We are constantly rolling out new products, so the right individual should be able to write clean code and work in a fast-paced environment. We value native ability, passion and the right attitude. Requirements - Technical proficiency with Python / Django. - Knowledge of JavaScript / AngularJS. - Experience with MySQL / PgSQL. - Experience with MVC design patterns and solid algorithm skills. - 1-3 years of industry experience. Location: Bangalore How to apply Send your resume to: careers [at] doublespring.com From jinsthomas at gmail.com Wed Apr 8 15:42:02 2015 From: jinsthomas at gmail.com (Jins Thomas) Date: Wed, 8 Apr 2015 19:12:02 +0530 Subject: [BangPypers] Python - Data Analysis - Book Suggestions? Message-ID: Hi all, I would like to request your suggestions on good books on Data Analysis in general, and also on Big Data using Python. Thank you so much for the support. Jins Thomas From anand21nanda at gmail.com Wed Apr 8 16:44:29 2015 From: anand21nanda at gmail.com (Anand Reddy Pandikunta) Date: Wed, 8 Apr 2015 20:14:29 +0530 Subject: [BangPypers] Python - Data Analysis - Book Suggestions?
In-Reply-To: References: Message-ID: Check out http://shop.oreilly.com/product/0636920023784.do. Regards, Anand Reddy Pandikunta www.avilpage.com On Wed, Apr 8, 2015 at 7:12 PM, Jins Thomas wrote: > Hi all, > > I would like to request your suggestions on good books on Data Analysis in > general, and also on Big Data using Python. > > > Thank you so much for the support. > > > Jins Thomas > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From kracethekingmaker at gmail.com Sun Apr 12 08:26:15 2015 From: kracethekingmaker at gmail.com (kracekumar ramaraju) Date: Sun, 12 Apr 2015 11:56:15 +0530 Subject: [BangPypers] April Meetup Message-ID: Hi, We're hosting the April meetup on the 25th at IBM Accelerator, Domlur. RSVP is open: http://www.meetup.com/BangPypers/events/178049472/. We're looking for speakers. Sessions are 15 or 30 minutes long. Session topics can be anything about Python: Python tools, experience of using those tools, etc. If you're interested in giving a talk, please drop a comment on the meetup page. -- Thanks & Regards, kracekumar. "Talk is cheap, show me the code" -- Linus Torvalds. http://kracekumar.com From noufal at nibrahim.net.in Sun Apr 12 08:36:51 2015 From: noufal at nibrahim.net.in (Noufal Ibrahim KV) Date: Sun, 12 Apr 2015 12:06:51 +0530 Subject: [BangPypers] April Meetup In-Reply-To: (kracekumar ramaraju's message of "Sun, 12 Apr 2015 11:56:15 +0530") References: Message-ID: <87mw2dlunw.fsf@nibrahim.net.in> On Sun, Apr 12 2015, kracekumar ramaraju wrote: > Hi, > > We're hosting the April meetup on the 25th at IBM Accelerator, Domlur. RSVP is open: > http://www.meetup.com/BangPypers/events/178049472/. [...] Krace, Is it always on the last Saturday of the month? Thanks.
-- Cordially, Noufal http://nibrahim.net.in From kracethekingmaker at gmail.com Sun Apr 12 09:02:03 2015 From: kracethekingmaker at gmail.com (kracekumar ramaraju) Date: Sun, 12 Apr 2015 12:32:03 +0530 Subject: [BangPypers] April Meetup In-Reply-To: <87mw2dlunw.fsf@nibrahim.net.in> References: <87mw2dlunw.fsf@nibrahim.net.in> Message-ID: On Sun, Apr 12, 2015 at 12:06 PM, Noufal Ibrahim KV wrote: > On Sun, Apr 12 2015, kracekumar ramaraju wrote: > > > Hi > > > > We're hosting April meetup on 25 in IBM Accelerator, Domlur. RSVP is > open > > http://www.meetup.com/BangPypers/events/178049472/. > > [...] > > Krace, > Is it always on the last Saturday of the month? > > No, it is the third Saturday of the month, following the same tradition. Unfortunately, this time we had a hiccup with the venue host. Thanks. > -- > Cordially, > Noufal > http://nibrahim.net.in > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > -- Thanks & Regards, kracekumar. "Talk is cheap, show me the code" -- Linus Torvalds. http://kracekumar.com From noufal at nibrahim.net.in Sun Apr 12 09:07:20 2015 From: noufal at nibrahim.net.in (Noufal Ibrahim KV) Date: Sun, 12 Apr 2015 12:37:20 +0530 Subject: [BangPypers] April Meetup In-Reply-To: (kracekumar ramaraju's message of "Sun, 12 Apr 2015 12:32:03 +0530") References: <87mw2dlunw.fsf@nibrahim.net.in> Message-ID: <87h9sllt93.fsf@nibrahim.net.in> On Sun, Apr 12 2015, kracekumar ramaraju wrote: [...] > No, it is the third Saturday of the month, following the same tradition. > Unfortunately, this time we had a hiccup with the venue host. [...] Cool. -- Cordially, Noufal http://nibrahim.net.in From anandpillai at letterboxes.org Mon Apr 13 07:57:37 2015 From: anandpillai at letterboxes.org (Anand B Pillai) Date: Mon, 13 Apr 2015 11:27:37 +0530 Subject: [BangPypers] Python - Data Analysis - Book Suggestions?
In-Reply-To: References: Message-ID: <552B5AD1.8050109@letterboxes.org> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On Wednesday 08 April 2015 07:12 PM, Jins Thomas wrote: > Hi all, > > I would like to request your suggestions on good books on Data > Analysis in general and also on Big Data using Python I am learning this right now using "Python for data analysis" (Wes McKinney) by O'Reilly. I have the 2013 edition. It teaches you everything from the basics of the numeric and scientific Python libraries up. Recommended. > > > Thank you so much for the support > > > Jins Thomas _______________________________________________ > BangPypers mailing list BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > - -- Regards, - --Anand - ---------------------------- Software Architect/Consultant anandpillai at letterboxes.org Cell: +919880078014 -----BEGIN PGP SIGNATURE----- Version: GnuPG v1 Comment: Using GnuPG with Thunderbird - http://www.enigmail.net/ iQEcBAEBAgAGBQJVK1rRAAoJEHKU2n17CpvDHEcIAMvO9eNm3jxtzGtXsxICuJd3 ZjUfu/ZWiBakZt3fO7b0G6ph2f1LeBVUpFLP0K+9olCpxRndj8IvIrqwuAcjEycq 5tHF7hTaGe366as9I2GiU5UN0tUHChSGZJ7DtIa5bTmwNmKTTSyTU7zNGOb6uvrv YGFTDfPCZ5A2JqEZxEFenWLyBiGMwIFsANDJvd8anfkPpkCNnCP3s354A4m5Vtrt e9LQO/x4jngztTlwLTywNx/skq3Kwk16jUfa4K+I3W07TS6kdb96Ukj4KwAd3S7h RbgR9Z2rRWIRbrK9bO7KZyLU5pHHZYVPKCSyj7p3eBQ6WRhHU7MFcvqlmTnjpTo= =UmKo -----END PGP SIGNATURE----- From mopuru.murali at gmail.com Mon Apr 13 13:50:09 2015 From: mopuru.murali at gmail.com (Murali Mopuru) Date: Mon, 13 Apr 2015 17:20:09 +0530 Subject: [BangPypers] JOB - VMware looking for Python/Django developers Message-ID: Hi All, If you are interested in VMware and have the following skills, please mail me your profile. 1. Very good Python developer with experience in any web framework development 2. Hands-on experience in Linux/UNIX. You should be capable of doing deployments, debugging, etc. 3. Knowledge of network concepts and protocols like TCP, UDP, HTTP, SSL, etc. 4.
Knowledge of cloud/virtualization systems concepts like hypervisor, VM, SAN, NAS, NFS, etc. *is a plus* Job: You will build webapps for VMware's internal product build systems, dev tools, etc. Best salary with amenities. For more info, please check VMware careers. - Murali From madhusudhan.sunkara at gmail.com Wed Apr 15 07:53:42 2015 From: madhusudhan.sunkara at gmail.com (Madhusudhan sunkara) Date: Wed, 15 Apr 2015 11:23:42 +0530 Subject: [BangPypers] [JOBS] Oracle India - devops Message-ID: Oracle India is looking for DevOps people with the following skillset. *Desired Skills & Experience* *Education/Qualifications:* B.E./B.Tech/MS/MTech/MCA/MSc in Computer Science or a related technical field *Experience/Skill:* 4-6 years of relevant industry work experience, including 3+ years of DevOps experience *Must have* - Experience running mission-critical Linux production servers - Experience with PaaS/SaaS operations - Direct experience with EC2, OpenStack or CloudStack - Experience with configuration automation tools (Chef, Puppet) - Python/Perl/Ruby and Bash development expertise desirable - Ability to work in a cross-cultural environment - Experience with virtualization technologies XenServer/KVM/LXC - Experience coding to product APIs - Demonstrated good judgment in solving problems as well as identifying problems in advance, and proposing solutions - Knowledge of issue-tracker workflows and wikis (i.e. JIRA, Confluence) - Experience with systems and network monitoring and performance analysis using Nagios, Cacti, Graphite or others - Experience managing SQL (Oracle/MySQL) and NoSQL servers Interested candidates can reply to this message. -- Madhu Sudhan Sunkara From arvind.ks1985 at gmail.com Fri Apr 17 08:05:41 2015 From: arvind.ks1985 at gmail.com (arvind ks) Date: Fri, 17 Apr 2015 11:35:41 +0530 Subject: [BangPypers] Need help with adding a character at the end of the line in a file in linux.
Message-ID: Hi Everyone, Could you please help me with the below Python script? It tries to add the characters A and H to lines starting with "service_description", depending on the contact groups.

*Input file::*

define service{
        use generic-service
        service_description CPU Load -
        check_command check_nrpe_cpu
        contact_groups Infrastructure,Platforms
}

*Script::*

#!/usr/bin/python
import re,sys,os,string
f1=open("carpet.txt","r")
f2=open("carpet1.txt","w")
reg1=re.compile("^\s+contact_groups")
reg2=re.compile(".*Infrastructure.*")
reg3=re.compile(".*Platforms.*")
reg4=re.compile("^\s+check_command")
reg5=re.compile("^\s+service_description")
lines=f1.readlines()
length=len(lines)
length_4_nl=length - 1
length_4_nnl=length - 2

for i in range(0,length,1):
    j = i - 1
    k = i + 1
    l = i + 2
    cl = lines[i]
    pl = lines[j]

    if i < length_4_nl:
        nl = lines[k]
    if i < length_4_nnl:
        nnl = lines[l]

    if reg5.match(cl) and reg2.match(nl):
        f2.write(cl.rstrip("\n") + " H" + ("\n"))
    if reg5.match(cl) and reg3.match(nl):
        f2.write(cl.rstrip("\n") + " A" + ("\n"))
    #else:
    #    f2.write(cl)

f1.close()
f2.close()

*Output::*

service_description NRPE Check Load - 1min/5min/15min - H
service_description NRPE Check Load - 1min/5min/15min - A

*Issue::* I want the output to be

service_description NRPE Check Load - 1min/5min/15min - H A

i.e. I have a tag "H" for Infrastructure and "A" for "Platforms", and I am trying to append both to the same line rather than creating a new line, but it doesn't work. Please help. From kishorbhat at gmail.com Fri Apr 17 08:11:25 2015 From: kishorbhat at gmail.com (Kishor Bhat) Date: Fri, 17 Apr 2015 11:41:25 +0530 Subject: [BangPypers] Need help with adding a character at the end of the line in a file in linux. In-Reply-To: References: Message-ID: On Fri, Apr 17, 2015 at 11:35 AM, arvind ks wrote: > Hi Everyone, > Could you please help me with the below python script that tries to add > characters A H to line starting with "service description" depending on > contact groups.
> > *Input file::* > > define service{ > use generic-service > service_description CPU Load - > check_command check_nrpe_cpu > contact_groups Infrastructure,Platforms > } > > *Script::* > > #!/usr/bin/python > import re,sys,os,string > f1=open("carpet.txt","r") > f2=open("carpet1.txt","w") > reg1=re.compile("^\s+contact_groups") > reg2=re.compile(".*Infrastructure.*") > reg3=re.compile(".*Platforms.*") > reg4=re.compile("^\s+check_command") > reg5=re.compile("^\s+service_description") > lines=f1.readlines() > length=len(lines) > length_4_nl=length - 1 > length_4_nnl=length - 2 > > > for i in range(0,length,1): > j = i - 1 > k = i + 1 > l = i + 2 > cl = lines[i] > pl = lines[j] > > if i < length_4_nl: > nl = lines[k] > if i < length_4_nnl: > nnl = lines[l] > > if reg5.match(cl) and reg2.match(nl): > f2.write(cl.rstrip("\n") + " H" + ("\n")) > if reg5.match(cl) and reg3.match(nl): > f2.write(cl.rstrip("\n") + " A" + ("\n")) You're writing the line to the file twice. Try changing your conditions; something along the lines of 'check for "HA", else "H", else "A".' should work. Regards, Kishor > #else: > # f2.write(cl) > > f1.close() > f2.close() > > > *Output::* > > service_description NRPE Check Load - 1min/5min/15min - H > service_description NRPE Check Load - 1min/5min/15min - A > > *Issue::* > I want the output to be > > service_description NRPE Check Load - 1min/5min/15min - H A > > i.e. if the i have a tag "H" for Infrastructure and "A" for "Platforms". > I am trying to append it to the same line rather than creating a new line. > But it doesnt work. Please help > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers From arvind.ks1985 at gmail.com Fri Apr 17 08:39:46 2015 From: arvind.ks1985 at gmail.com (arvind ks) Date: Fri, 17 Apr 2015 12:09:46 +0530 Subject: [BangPypers] Need help with adding a character at the end of the line in a file in linux. 
In-Reply-To: References: Message-ID: Hi Kishore, Thanks for the reply. I've tried adding some additional checks, but unable to understand where i am making a mistake. Could you please check and advise. Input:: define service{ use generic-service service_description CPU Load - check_command check_nrpe_cpu contact_groups Infrastructure,Platforms } Script:: #!/usr/bin/python import re,sys,os,string f1=open("input.txt","r") f2=open("out.txt","w") reg1=re.compile("^\s+contact_groups") reg2=re.compile(".*Infrastructure.*") reg3=re.compile(".*Platforms.*") reg4=re.compile("^\s+check_command") reg5=re.compile("^\s+service_description") lines=f1.readlines() length=len(lines) length_4_nl=length - 1 length_4_nnl=length - 2 for i in range(0,length,1): j = i - 1 k = i + 1 l = i + 2 cl = lines[i] pl = lines[j] if i < length_4_nl: nl = lines[k] if i < length_4_nnl: nnl = lines[l] if reg5.match(cl): if reg5.match(cl) and reg2.match(nl) and reg4.match(nnl): f2.write(cl.rstrip("\n") + " H" + ("\n")) if reg5.match(cl) and reg2.match(nnl) and reg4.match(nl): f2.write(cl.rstrip("\n") + " H" + ("\n")) if reg5.match(cl) and reg3.match(nl) and reg4.match(nnl): f2.write(cl.rstrip("\n") + " A" + ("\n")) if reg5.match(cl) and reg3.match(nnl) and reg4.match(nl): f2.write(cl.rstrip("\n") + " A" + ("\n")) else: f2.write(cl) f1.close() f2.close() Output:: define service{ use generic-service service_description CPU Load - H service_description CPU Load - A check_command check_nrpe_cpu contact_groups Infrastructure,Platforms } I am again getting 2 seperate lines. It should ideally be in a single line i.e. H A as there are 2 entries in contact_groups lines i.e. Infrastructure and Platforms. Thanks Arvind On Fri, Apr 17, 2015 at 11:41 AM, Kishor Bhat wrote: > On Fri, Apr 17, 2015 at 11:35 AM, arvind ks > wrote: > > Hi Everyone, > > Could you please help me with the below python script that tries to add > > characters A H to line starting with "service description" depending on > > contact groups. 
> > > > *Input file::* > > > > define service{ > > use generic-service > > service_description CPU Load - > > check_command check_nrpe_cpu > > contact_groups Infrastructure,Platforms > > } > > > > *Script::* > > > > #!/usr/bin/python > > import re,sys,os,string > > f1=open("carpet.txt","r") > > f2=open("carpet1.txt","w") > > reg1=re.compile("^\s+contact_groups") > > reg2=re.compile(".*Infrastructure.*") > > reg3=re.compile(".*Platforms.*") > > reg4=re.compile("^\s+check_command") > > reg5=re.compile("^\s+service_description") > > lines=f1.readlines() > > length=len(lines) > > length_4_nl=length - 1 > > length_4_nnl=length - 2 > > > > > > for i in range(0,length,1): > > j = i - 1 > > k = i + 1 > > l = i + 2 > > cl = lines[i] > > pl = lines[j] > > > > if i < length_4_nl: > > nl = lines[k] > > if i < length_4_nnl: > > nnl = lines[l] > > > > if reg5.match(cl) and reg2.match(nl): > > f2.write(cl.rstrip("\n") + " H" + ("\n")) > > if reg5.match(cl) and reg3.match(nl): > > f2.write(cl.rstrip("\n") + " A" + ("\n")) > > You're writing the line to the file twice. > Try changing your conditions; something along the lines of 'check for > "HA", else "H", else "A".' should work. > > Regards, > Kishor > > > #else: > > # f2.write(cl) > > > > f1.close() > > f2.close() > > > > > > *Output::* > > > > service_description NRPE Check Load - 1min/5min/15min - H > > service_description NRPE Check Load - 1min/5min/15min - A > > > > *Issue::* > > I want the output to be > > > > service_description NRPE Check Load - 1min/5min/15min - H A > > > > i.e. if the i have a tag "H" for Infrastructure and "A" for "Platforms". > > I am trying to append it to the same line rather than creating a new > line. > > But it doesnt work. 
Please help > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From prince09cs at gmail.com Fri Apr 17 09:16:12 2015 From: prince09cs at gmail.com (Prince Sharma) Date: Fri, 17 Apr 2015 12:46:12 +0530 Subject: [BangPypers] Need help with adding a character at the end of the line in a file in linux. In-Reply-To: References: Message-ID: For that, you should compile a regex with Infrastructure and Platforms together, and run that check before the separate checks for Infrastructure and Platforms. On Apr 17, 2015 12:10 PM, "arvind ks" wrote: > Hi Kishore, > Thanks for the reply. I've tried adding some additional checks, but unable > to understand where i am making a mistake. > Could you please check and advise. > > Input:: > > define service{ > use generic-service > service_description CPU Load - > check_command check_nrpe_cpu > contact_groups Infrastructure,Platforms > } > > Script:: > > #!/usr/bin/python > import re,sys,os,string > f1=open("input.txt","r") > f2=open("out.txt","w") > reg1=re.compile("^\s+contact_groups") > reg2=re.compile(".*Infrastructure.*") > reg3=re.compile(".*Platforms.*") > reg4=re.compile("^\s+check_command") > reg5=re.compile("^\s+service_description") > lines=f1.readlines() > length=len(lines) > length_4_nl=length - 1 > length_4_nnl=length - 2 > > > for i in range(0,length,1): > j = i - 1 > k = i + 1 > l = i + 2 > cl = lines[i] > pl = lines[j] > > if i < length_4_nl: > nl = lines[k] > if i < length_4_nnl: > nnl = lines[l] > > if reg5.match(cl): > if reg5.match(cl) and reg2.match(nl) and reg4.match(nnl): > f2.write(cl.rstrip("\n") + " H" + ("\n")) > if reg5.match(cl) and reg2.match(nnl) and reg4.match(nl): > f2.write(cl.rstrip("\n") + " H" + ("\n")) > if reg5.match(cl) and
reg3.match(nl) and reg4.match(nnl): > f2.write(cl.rstrip("\n") + " A" + ("\n")) > if reg5.match(cl) and reg3.match(nnl) and reg4.match(nl): > f2.write(cl.rstrip("\n") + " A" + ("\n")) > else: > f2.write(cl) > > f1.close() > f2.close() > > > Output:: > define service{ > use generic-service > service_description CPU Load - H > service_description CPU Load - A > check_command check_nrpe_cpu > contact_groups Infrastructure,Platforms > } > > > > > I am again getting 2 seperate lines. It should ideally be in a single line > i.e. H A as there are 2 entries in contact_groups lines i.e. Infrastructure > and Platforms. > > Thanks > Arvind > > On Fri, Apr 17, 2015 at 11:41 AM, Kishor Bhat > wrote: > > > On Fri, Apr 17, 2015 at 11:35 AM, arvind ks > > wrote: > > > Hi Everyone, > > > Could you please help me with the below python script that tries to add > > > characters A H to line starting with "service description" depending on > > > contact groups. > > > > > > *Input file::* > > > > > > define service{ > > > use generic-service > > > service_description CPU Load - > > > check_command check_nrpe_cpu > > > contact_groups Infrastructure,Platforms > > > } > > > > > > *Script::* > > > > > > #!/usr/bin/python > > > import re,sys,os,string > > > f1=open("carpet.txt","r") > > > f2=open("carpet1.txt","w") > > > reg1=re.compile("^\s+contact_groups") > > > reg2=re.compile(".*Infrastructure.*") > > > reg3=re.compile(".*Platforms.*") > > > reg4=re.compile("^\s+check_command") > > > reg5=re.compile("^\s+service_description") > > > lines=f1.readlines() > > > length=len(lines) > > > length_4_nl=length - 1 > > > length_4_nnl=length - 2 > > > > > > > > > for i in range(0,length,1): > > > j = i - 1 > > > k = i + 1 > > > l = i + 2 > > > cl = lines[i] > > > pl = lines[j] > > > > > > if i < length_4_nl: > > > nl = lines[k] > > > if i < length_4_nnl: > > > nnl = lines[l] > > > > > > if reg5.match(cl) and reg2.match(nl): > > > f2.write(cl.rstrip("\n") + " H" + ("\n")) > > > if reg5.match(cl) and 
reg3.match(nl): > > > f2.write(cl.rstrip("\n") + " A" + ("\n")) > > > > You're writing the line to the file twice. > > Try changing your conditions; something along the lines of 'check for > > "HA", else "H", else "A".' should work. > > > > Regards, > > Kishor > > > > > #else: > > > # f2.write(cl) > > > > > > f1.close() > > > f2.close() > > > > > > > > > *Output::* > > > > > > service_description NRPE Check Load - 1min/5min/15min - H > > > service_description NRPE Check Load - 1min/5min/15min - A > > > > > > *Issue::* > > > I want the output to be > > > > > > service_description NRPE Check Load - 1min/5min/15min - H A > > > > > > i.e. if the i have a tag "H" for Infrastructure and "A" for > "Platforms". > > > I am trying to append it to the same line rather than creating a new > > line. > > > But it doesnt work. Please help > > > _______________________________________________ > > > BangPypers mailing list > > > BangPypers at python.org > > > https://mail.python.org/mailman/listinfo/bangpypers > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From arvind.ks1985 at gmail.com Fri Apr 17 09:38:20 2015 From: arvind.ks1985 at gmail.com (arvind ks) Date: Fri, 17 Apr 2015 13:08:20 +0530 Subject: [BangPypers] Need help with adding a character at the end of the line in a file in linux. In-Reply-To: References: Message-ID: Hi Prince, In my case i have several of those service description entries where sometime both Infrastructure and Platform are present and sometimes only Infrastructure or Platform is present. Hence i didn't create a regex with both combined. 
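[Editor's note] Pulling the thread together: below is a minimal sketch of the accumulate-then-write approach the replies are converging on. It is not from the thread itself; the `TAGS` mapping, the function name `tag_service_descriptions`, and the sample data are illustrative. The idea is to remember the index of the block's `service_description` line and, once the block's `contact_groups` line is seen, append every matching tag to that one remembered line in a single write.

```python
import re

# Illustrative mapping from contact group name to tag; extend as needed.
TAGS = [("Infrastructure", "H"), ("Platforms", "A")]

def tag_service_descriptions(lines):
    """Return a copy of `lines` where each service_description line gains
    one tag per matching group named on its block's contact_groups line."""
    out = list(lines)
    desc_re = re.compile(r"^\s*service_description")
    groups_re = re.compile(r"^\s*contact_groups\s+(\S+)")
    desc_idx = None                      # index of the pending description line
    for i, line in enumerate(lines):
        if desc_re.match(line):
            desc_idx = i                 # remember it until contact_groups shows up
        m = groups_re.match(line)
        if m and desc_idx is not None:
            # Collect all matching tags first, then append them in one write.
            tags = [tag for group, tag in TAGS if group in m.group(1)]
            if tags:
                out[desc_idx] = out[desc_idx].rstrip("\n") + " " + " ".join(tags) + "\n"
            desc_idx = None
        if line.strip() == "}":
            desc_idx = None              # block ended without contact_groups
    return out

if __name__ == "__main__":
    sample = [
        "define service{\n",
        "        use generic-service\n",
        "        service_description CPU Load -\n",
        "        check_command check_nrpe_cpu\n",
        "        contact_groups Infrastructure,Platforms\n",
        "}\n",
    ]
    print("".join(tag_service_descriptions(sample)))
```

With the real config, this would slot into the original script's I/O as `f2.writelines(tag_service_descriptions(f1.readlines()))`: a block listing only one group gets a single tag, and a block listing both gets "H A" on one line.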
Thanks Arvind On Fri, Apr 17, 2015 at 12:46 PM, Prince Sharma wrote: > For that you should compile the regex with infrastructure and platform > together. And have that check before checks for infrastructure and platform > separately. > On Apr 17, 2015 12:10 PM, "arvind ks" wrote: > > > Hi Kishore, > > Thanks for the reply. I've tried adding some additional checks, but > unable > > to understand where i am making a mistake. > > Could you please check and advise. > > > > Input:: > > > > define service{ > > use generic-service > > service_description CPU Load - > > check_command check_nrpe_cpu > > contact_groups Infrastructure,Platforms > > } > > > > Script:: > > > > #!/usr/bin/python > > import re,sys,os,string > > f1=open("input.txt","r") > > f2=open("out.txt","w") > > reg1=re.compile("^\s+contact_groups") > > reg2=re.compile(".*Infrastructure.*") > > reg3=re.compile(".*Platforms.*") > > reg4=re.compile("^\s+check_command") > > reg5=re.compile("^\s+service_description") > > lines=f1.readlines() > > length=len(lines) > > length_4_nl=length - 1 > > length_4_nnl=length - 2 > > > > > > for i in range(0,length,1): > > j = i - 1 > > k = i + 1 > > l = i + 2 > > cl = lines[i] > > pl = lines[j] > > > > if i < length_4_nl: > > nl = lines[k] > > if i < length_4_nnl: > > nnl = lines[l] > > > > if reg5.match(cl): > > if reg5.match(cl) and reg2.match(nl) and reg4.match(nnl): > > f2.write(cl.rstrip("\n") + " H" + ("\n")) > > if reg5.match(cl) and reg2.match(nnl) and reg4.match(nl): > > f2.write(cl.rstrip("\n") + " H" + ("\n")) > > if reg5.match(cl) and reg3.match(nl) and reg4.match(nnl): > > f2.write(cl.rstrip("\n") + " A" + ("\n")) > > if reg5.match(cl) and reg3.match(nnl) and reg4.match(nl): > > f2.write(cl.rstrip("\n") + " A" + ("\n")) > > else: > > f2.write(cl) > > > > f1.close() > > f2.close() > > > > > > Output:: > > define service{ > > use generic-service > > service_description CPU Load - H > > service_description CPU Load - A > > check_command check_nrpe_cpu > > 
contact_groups Infrastructure,Platforms > > } > > > > > > > > > > I am again getting 2 seperate lines. It should ideally be in a single > line > > i.e. H A as there are 2 entries in contact_groups lines i.e. > Infrastructure > > and Platforms. > > > > Thanks > > Arvind > > > > On Fri, Apr 17, 2015 at 11:41 AM, Kishor Bhat > > wrote: > > > > > On Fri, Apr 17, 2015 at 11:35 AM, arvind ks > > > wrote: > > > > Hi Everyone, > > > > Could you please help me with the below python script that tries to > add > > > > characters A H to line starting with "service description" depending > on > > > > contact groups. > > > > > > > > *Input file::* > > > > > > > > define service{ > > > > use generic-service > > > > service_description CPU Load - > > > > check_command check_nrpe_cpu > > > > contact_groups Infrastructure,Platforms > > > > } > > > > > > > > *Script::* > > > > > > > > #!/usr/bin/python > > > > import re,sys,os,string > > > > f1=open("carpet.txt","r") > > > > f2=open("carpet1.txt","w") > > > > reg1=re.compile("^\s+contact_groups") > > > > reg2=re.compile(".*Infrastructure.*") > > > > reg3=re.compile(".*Platforms.*") > > > > reg4=re.compile("^\s+check_command") > > > > reg5=re.compile("^\s+service_description") > > > > lines=f1.readlines() > > > > length=len(lines) > > > > length_4_nl=length - 1 > > > > length_4_nnl=length - 2 > > > > > > > > > > > > for i in range(0,length,1): > > > > j = i - 1 > > > > k = i + 1 > > > > l = i + 2 > > > > cl = lines[i] > > > > pl = lines[j] > > > > > > > > if i < length_4_nl: > > > > nl = lines[k] > > > > if i < length_4_nnl: > > > > nnl = lines[l] > > > > > > > > if reg5.match(cl) and reg2.match(nl): > > > > f2.write(cl.rstrip("\n") + " H" + ("\n")) > > > > if reg5.match(cl) and reg3.match(nl): > > > > f2.write(cl.rstrip("\n") + " A" + ("\n")) > > > > > > You're writing the line to the file twice. > > > Try changing your conditions; something along the lines of 'check for > > > "HA", else "H", else "A".' should work. 
> > > > > > Regards, > > > Kishor > > > > > > > #else: > > > > # f2.write(cl) > > > > > > > > f1.close() > > > > f2.close() > > > > > > > > > > > > *Output::* > > > > > > > > service_description NRPE Check Load - 1min/5min/15min - H > > > > service_description NRPE Check Load - 1min/5min/15min - A > > > > > > > > *Issue::* > > > > I want the output to be > > > > > > > > service_description NRPE Check Load - 1min/5min/15min - H A > > > > > > > > i.e. if the i have a tag "H" for Infrastructure and "A" for > > "Platforms". > > > > I am trying to append it to the same line rather than creating a new > > > line. > > > > But it doesnt work. Please help > > > > _______________________________________________ > > > > BangPypers mailing list > > > > BangPypers at python.org > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > _______________________________________________ > > > BangPypers mailing list > > > BangPypers at python.org > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From jinsthomas at gmail.com Fri Apr 17 10:23:22 2015 From: jinsthomas at gmail.com (Jins Thomas) Date: Fri, 17 Apr 2015 13:53:22 +0530 Subject: [BangPypers] Python - Data Analysis - Book Suggestions? In-Reply-To: <552B5AD1.8050109@letterboxes.org> References: <552B5AD1.8050109@letterboxes.org> Message-ID: > On Wednesday 08 April 2015 07:12 PM, Jins Thomas wrote: > > Hi all, > > > > I would like to request your suggestions on good books on Data > > Analysis in general and also on Big Data using Python > > I am learning this right now using > "Python for data analysis" (Wes McKinney) by O'Reilly. > > I am having the 2013 edition. 
> > It teaches you stuff from the basics of numeric and scientific Python > libraries. Recommended. > > Thanks all. Very much appreciated! > > > Jins Thomas _______________________________________________ > > > -- > Regards, > > --Anand > > ---------------------------- > Software Architect/Consultant > anandpillai at letterboxes.org > > Cell: +919880078014 > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From ashek70 at gmail.com Mon Apr 20 16:07:48 2015 From: ashek70 at gmail.com (Abishek) Date: Mon, 20 Apr 2015 19:37:48 +0530 Subject: [BangPypers] fresher jobs help Message-ID: I am Abishek, a 2015 B.E. (Computer Science and Engineering) graduate, looking for Python-based jobs in Bangalore. If anyone knows of job opportunities, please let me know. I am very interested in Python programming, networking, and security...
From noufal at nibrahim.net.in Mon Apr 20 17:27:40 2015 From: noufal at nibrahim.net.in (Noufal Ibrahim KV) Date: Mon, 20 Apr 2015 20:57:40 +0530 Subject: [BangPypers] fresher jobs help In-Reply-To: (Abishek's message of "Mon, 20 Apr 2015 19:37:48 +0530") References: Message-ID: <874moaom4z.fsf@nibrahim.net.in> On Mon, Apr 20 2015, Abishek wrote: > I am Abishek, a 2015 B.E. (Computer Science and Engineering) graduate, > looking for Python-based jobs in Bangalore. If anyone knows of job > opportunities, please let me know. I am very interested in Python > programming, networking, and security... [...] Try https://hasjob.co/?l=bangalore&q=python -- Cordially, Noufal http://nibrahim.net.in From kishanmehta3 at gmail.com Wed Apr 22 08:33:14 2015 From: kishanmehta3 at gmail.com (Kishan Mehta) Date: Wed, 22 Apr 2015 12:03:14 +0530 Subject: [BangPypers] Host a web application. Message-ID: Hi, I have to show my web application project work to a potential employer. Is it possible to host the demo somewhere? Any guide on this? Thanks for help, Kishankumar Mehta From rajiv.m1991 at gmail.com Wed Apr 22 08:35:34 2015 From: rajiv.m1991 at gmail.com (Rajiv Subramanian M) Date: Wed, 22 Apr 2015 06:35:34 +0000 Subject: [BangPypers] Host a web application. In-Reply-To: References: Message-ID: Try openshift.com or heroku.com; I've tried both and they are good From shrayasr at gmail.com Wed Apr 22 08:36:21 2015 From: shrayasr at gmail.com (Shrayas rajagopal) Date: Wed, 22 Apr 2015 12:06:21 +0530 Subject: [BangPypers] Host a web application. In-Reply-To: References: Message-ID: On Wed, Apr 22, 2015 at 12:03 PM, Kishan Mehta wrote: > I have to show my web application project work to a potential employer. > Is it possible to host the demo somewhere? Any guide on this? The easiest way to do this would be to use a service like heroku[1]. They provide a great free package that you can use for throwaway projects like what you mention.
It is as simple as creating what they call a ProcFile and adding heroku as a git remote and pushing to them. Check it out. [1]:http://heroku.com/ From me at bibhas.in Wed Apr 22 08:37:27 2015 From: me at bibhas.in (Bibhas Ch Debnath) Date: Wed, 22 Apr 2015 12:07:27 +0530 Subject: [BangPypers] Host a web application. In-Reply-To: References: Message-ID: On Apr 22, 2015 12:03 PM, "Kishan Mehta" wrote: > > Hi , > > I have to show my web application project work to a potential employer. > Is it possible to host the demo somewhere? Any guide on this? If you just have to show it for a brief period, use ngrok instead of hosting it somewhere. It takes your local instance and makes it available online as long as you want. > > Thanks for help, > Kishankumar Mehta > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers From shrayasr at gmail.com Wed Apr 22 08:38:29 2015 From: shrayasr at gmail.com (Shrayas rajagopal) Date: Wed, 22 Apr 2015 12:08:29 +0530 Subject: [BangPypers] Host a web application. In-Reply-To: References: Message-ID: On Wed, Apr 22, 2015 at 12:07 PM, Bibhas Ch Debnath wrote: > If you just have to show it for a brief period, use ngrok instead of > hosting it somewhere. It takes your local instance and makes it available > online as long as you want. +1 for ngrok[1]. Simple and straightforward! [1]: https://ngrok.com/ From kishanmehta3 at gmail.com Wed Apr 22 08:53:38 2015 From: kishanmehta3 at gmail.com (Kishan Mehta) Date: Wed, 22 Apr 2015 12:23:38 +0530 Subject: [BangPypers] Host a web application. In-Reply-To: References: Message-ID: Ok Thanks.... On Apr 22, 2015 12:09 PM, "Shrayas rajagopal" wrote: > On Wed, Apr 22, 2015 at 12:07 PM, Bibhas Ch Debnath wrote: > > If you just have to show it for a brief period, use ngrok instead of > > hosting it somewhere. It takes your local instance and makes it available > > online as long as you want. 
> > +1 for ngrok[1]. Simple and straightforward! > > [1]: https://ngrok.com/ > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From nimish.s.dalal at gmail.com Fri Apr 24 08:46:03 2015 From: nimish.s.dalal at gmail.com (Nimish Dalal) Date: Fri, 24 Apr 2015 12:16:03 +0530 Subject: [BangPypers] Newspaper in Python Message-ID: Hi, I am new to python and need help with newspaper. I am using this module as I find it easier to extract the urls from the website. Here's my code: import newspaper Sc_paper = newspaper.build(u'http://scroll.in/') for article in Sc_paper.articles: print(article.url) http://scroll.in/... http://scroll.in/... Instead of print I want the urls to export as a .txt file or .csv file. Thanks in advance. -- Nimish S. Dalal Cell: 9819670499 Facebook: http://www.facebook.com/nimish.s.dalal Twitter: http://twitter.com/nimishdalal Linkedin:http://in.linkedin.com/in/nimishsdll Url: http://www.nimishdalal.me Our generation has had no Great Depression, no Great War. Our war is a spiritual war. Our great depression is our lives. From dmjan21 at gmail.com Fri Apr 24 08:51:04 2015 From: dmjan21 at gmail.com (Dhawal Joharapurkar) Date: Fri, 24 Apr 2015 12:21:04 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: This is a simple problem. 1. Create a file handle. 2. Write to the file using the file handle. 3. Close the file handle. import newspaper outfile = open('outfile.txt', 'a+') # Create a file handle Sc_paper = newspaper.build(u'http://scroll.in/') for article in Sc_paper.articles: print(article.url) outfile.write(article) # Write to file outfile.write('\n') outfile.close() # Close the file handle On Fri, Apr 24, 2015 at 12:16 PM, Nimish Dalal wrote: > Hi, I am new to python and need help with newspaper. > I am using this module as I find it easier to extract the urls from the > website.
> > Here's my code: > > import newspaper > Sc_paper = newspaper.build(u'http://scroll.in/') > for article in Sc_paper.articles: > print(article.url) > http://scroll.in/... > http://scroll.in/... > > Instead of print I want the urls to export as a .txt file or .csv file. > > Thanks in advance. > > > -- > Nimish S. Dalal > Cell: 9819670499 > Facebook: http://www.facebook.com/nimish.s.dalal > Twitter: http://twitter.com/nimishdalal > Linkedin:http://in.linkedin.com/in/nimishsdll > Url: http://www.nimishdalal.me > > Our generation has had no Great Depression, no Great War. Our war is a > spiritual war. Our great depression is our lives. > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From prince09cs at gmail.com Fri Apr 24 08:59:41 2015 From: prince09cs at gmail.com (Prince Sharma) Date: Fri, 24 Apr 2015 12:29:41 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: By export you mean you want to create a text file for every URL? Hi, I am new to python and need help with newspaper. I am using this module as I find it easier to extract the urls from the website. Here's my code: import newspaper Sc_paper = newspaper.build(u'http://scroll.in/') for article in Sc_paper.articles: print(article.url) http://scroll.in/... http://scroll.in/... Instead of print I want the urls to export as a .txt file or .csv file. Thanks in advance. -- Nimish S. Dalal Cell: 9819670499 Facebook: http://www.facebook.com/nimish.s.dalal Twitter: http://twitter.com/nimishdalal Linkedin:http://in.linkedin.com/in/nimishsdll Url: http://www.nimishdalal.me Our generation has had no Great Depression, no Great War. Our war is a spiritual war. Our great depression is our lives. 
_______________________________________________ BangPypers mailing list BangPypers at python.org https://mail.python.org/mailman/listinfo/bangpypers From sshabinesh at gmail.com Fri Apr 24 09:18:34 2015 From: sshabinesh at gmail.com (sshabinesh at gmail.com) Date: Fri, 24 Apr 2015 12:48:34 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: -OR- import newspaper articles = [ ] Sc_paper = newspaper.build(u'http://scroll.in/') for article in Sc_paper.articles: articles.append(article) with open("filename", "w") as f: f.write( "\n".join(articles) ) On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma wrote: > By export you mean you want to create a text file for every URL? > Hi, I am new to python and need help with newspaper. > I am using this module as I find it easier to extract the urls from the > website. > > Here's my code: > > import newspaper > Sc_paper = newspaper.build(u'http://scroll.in/') > for article in Sc_paper.articles: > print(article.url) > http://scroll.in/... > http://scroll.in/... > > Instead of print I want the urls to export as a .txt file or .csv file. > > Thanks in advance. > > > -- > Nimish S. Dalal > Cell: 9819670499 > Facebook: http://www.facebook.com/nimish.s.dalal > Twitter: http://twitter.com/nimishdalal > Linkedin:http://in.linkedin.com/in/nimishsdll > Url: http://www.nimishdalal.me > > Our generation has had no Great Depression, no Great War. Our war is a > spiritual war. Our great depression is our lives. 
> _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From nimish.s.dalal at gmail.com Fri Apr 24 09:47:37 2015 From: nimish.s.dalal at gmail.com (Nimish Dalal) Date: Fri, 24 Apr 2015 13:17:37 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: On Fri, Apr 24, 2015 at 12:21 PM, Dhawal Joharapurkar wrote: > This is a simple problem. > > 1. Create a file handle. > 2. Write to the file using the file handle. > 3. Close the file handle. > > import newspaper > outfile = open('outfile.txt', 'a+) # Create a file handle > > Sc_paper = newspaper.build(u'http://scroll.in/') > for article in Sc_paper.articles: > print(article.url) > outfile.write(article) # Write to file > outfile.write('\n') > > outfile.close() # Close the file handle > Thanks Dhawal, I tried this code and it gives me a Type error import newspaper outfile = open('I:\Python\outfile.txt', 'a+') # Create a file handle Sc_paper = newspaper.build(u'http://scroll.in/') for article in Sc_paper.articles: print(article.url) outfile.write(article) # Write to file outfile.write('\n') outfile.close() TypeError: expected a character buffer object. > On Fri, Apr 24, 2015 at 12:16 PM, Nimish Dalal > wrote: > > > Hi, I am new to python and need help with newspaper. > > I am using this module as I find it easier to extract the urls from the > > website. > > > > Here's my code: > > > > import newspaper > > Sc_paper = newspaper.build(u'http://scroll.in/') > > for article in Sc_paper.articles: > > print(article.url) > > http://scroll.in/... > > http://scroll.in/... > > > > Instead of print I want the urls to export as a .txt file or .csv file. > > > > Thanks in advance. > > > > > > -- > > Nimish S. 
Dalal > > Cell: 9819670499 > > Facebook: http://www.facebook.com/nimish.s.dalal > > Twitter: http://twitter.com/nimishdalal > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > Url: http://www.nimishdalal.me > > > > Our generation has had no Great Depression, no Great War. Our war is a > > spiritual war. Our great depression is our lives. > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > -- Nimish S. Dalal Cell: 9819670499 Facebook: http://www.facebook.com/nimish.s.dalal Twitter: http://twitter.com/nimishdalal Linkedin:http://in.linkedin.com/in/nimishsdll Url: http://www.nimishdalal.me Our generation has had no Great Depression, no Great War. Our war is a spiritual war. Our great depression is our lives. From nimish.s.dalal at gmail.com Fri Apr 24 09:49:33 2015 From: nimish.s.dalal at gmail.com (Nimish Dalal) Date: Fri, 24 Apr 2015 13:19:33 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma wrote: > By export you mean you want to create a text file for every URL? > Hey Prince, I want all the urls to be compiled in a text file. Hi, I am new to python and need help with newspaper. > I am using this module as I find it easier to extract the urls from the > website. > > Here's my code: > > import newspaper > Sc_paper = newspaper.build(u'http://scroll.in/') > for article in Sc_paper.articles: > print(article.url) > http://scroll.in/... > http://scroll.in/... > > Instead of print I want the urls to export as a .txt file or .csv file. > > Thanks in advance. > > > -- > Nimish S. 
Dalal > Cell: 9819670499 > Facebook: http://www.facebook.com/nimish.s.dalal > Twitter: http://twitter.com/nimishdalal > Linkedin:http://in.linkedin.com/in/nimishsdll > Url: http://www.nimishdalal.me > > Our generation has had no Great Depression, no Great War. Our war is a > spiritual war. Our great depression is our lives. > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > -- Nimish S. Dalal Cell: 9819670499 Facebook: http://www.facebook.com/nimish.s.dalal Twitter: http://twitter.com/nimishdalal Linkedin:http://in.linkedin.com/in/nimishsdll Url: http://www.nimishdalal.me Our generation has had no Great Depression, no Great War. Our war is a spiritual war. Our great depression is our lives. From nimish.s.dalal at gmail.com Fri Apr 24 09:53:31 2015 From: nimish.s.dalal at gmail.com (Nimish Dalal) Date: Fri, 24 Apr 2015 13:23:31 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: On Fri, Apr 24, 2015 at 12:48 PM, sshabinesh at gmail.com wrote: > -OR- > > import newspaper > > articles = [ ] > Sc_paper = newspaper.build(u'http://scroll.in/') > > for article in Sc_paper.articles: > articles.append(article) > > with open("filename", "w") as f: > f.write( "\n".join(articles) ) > > Thanks Shabinesh, I tried this code but gives me an error TypeError: sequence item 0: expected string, Article found On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma > wrote: > > > By export you mean you want to create a text file for every URL? > > Hi, I am new to python and need help with newspaper. > > I am using this module as I find it easier to extract the urls from the > > website. 
> > > > Here's my code: > > > > import newspaper > > Sc_paper = newspaper.build(u'http://scroll.in/') > > for article in Sc_paper.articles: > > print(article.url) > > http://scroll.in/... > > http://scroll.in/... > > > > Instead of print I want the urls to export as a .txt file or .csv file. > > > > Thanks in advance. > > > > > > -- > > Nimish S. Dalal > > Cell: 9819670499 > > Facebook: http://www.facebook.com/nimish.s.dalal > > Twitter: http://twitter.com/nimishdalal > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > Url: http://www.nimishdalal.me > > > > Our generation has had no Great Depression, no Great War. Our war is a > > spiritual war. Our great depression is our lives. > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > -- Nimish S. Dalal Cell: 9819670499 Facebook: http://www.facebook.com/nimish.s.dalal Twitter: http://twitter.com/nimishdalal Linkedin:http://in.linkedin.com/in/nimishsdll Url: http://www.nimishdalal.me Our generation has had no Great Depression, no Great War. Our war is a spiritual war. Our great depression is our lives. 
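The TypeError above comes from str.join, which only accepts strings; the list being joined holds Article objects, not URL strings. A small stand-in (a dummy Article class in place of the newspaper library, with made-up URLs) reproduces the error and shows the fix of collecting article.url instead:

```python
# Dummy stand-in for newspaper's Article; only the .url attribute matters here.
class Article:
    def __init__(self, url):
        self.url = url

articles = [Article("http://scroll.in/a"), Article("http://scroll.in/b")]

try:
    "\n".join(articles)            # fails: join() only accepts strings
except TypeError as exc:
    print("join failed:", exc)

urls = [a.url for a in articles]   # collect the URL strings instead
print("\n".join(urls))
```

The same applies to the thread's snippet: appending article.url (a string) rather than article (an object) makes the final join work.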
From dmjan21 at gmail.com Fri Apr 24 09:56:30 2015 From: dmjan21 at gmail.com (Dhawal Joharapurkar) Date: Fri, 24 Apr 2015 13:26:30 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: import newspaper outfile = open('outfile.txt', 'a+') # Create a file handle Sc_paper = newspaper.build(u'http://scroll.in/') for article in Sc_paper.articles: print(article.url) outfile.write(article.url) # Write to file outfile.write('\n') outfile.close() # Close the file handle This should work. On Fri, Apr 24, 2015 at 1:23 PM, Nimish Dalal wrote: > On Fri, Apr 24, 2015 at 12:48 PM, sshabinesh at gmail.com < > sshabinesh at gmail.com > > wrote: > > > -OR- > > > > import newspaper > > > > articles = [ ] > > Sc_paper = newspaper.build(u'http://scroll.in/') > > > > for article in Sc_paper.articles: > > articles.append(article) > > > > with open("filename", "w") as f: > > f.write( "\n".join(articles) ) > > > > Thanks Shabinesh, > > I tried this code but gives me an error > TypeError: sequence item 0: expected string, Article found > > On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma > > wrote: > > > > > By export you mean you want to create a text file for every URL? > > > Hi, I am new to python and need help with newspaper. > > > I am using this module as I find it easier to extract the urls from the > > > website. > > > > > > Here's my code: > > > > > > import newspaper > > > Sc_paper = newspaper.build(u'http://scroll.in/') > > > for article in Sc_paper.articles: > > > print(article.url) > > > http://scroll.in/... > > > http://scroll.in/... > > > > > > Instead of print I want the urls to export as a .txt file or .csv > file. > > > > > > Thanks in advance. > > > > > > -- > > > Nimish S.
Dalal > > > Cell: 9819670499 > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > Twitter: http://twitter.com/nimishdalal < > http://twitter.com/nimishsdalal > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > Url: http://www.nimishdalal.me > > > > > > Our generation has had no Great Depression, no Great War. Our war is a > > > spiritual war. Our great depression is our lives. > > > _______________________________________________ > > > BangPypers mailing list > > > BangPypers at python.org > > > https://mail.python.org/mailman/listinfo/bangpypers > > > _______________________________________________ > > > BangPypers mailing list > > > BangPypers at python.org > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > -- > Nimish S. Dalal > Cell: 9819670499 > Facebook: http://www.facebook.com/nimish.s.dalal > Twitter: http://twitter.com/nimishdalal > Linkedin:http://in.linkedin.com/in/nimishsdll > Url: http://www.nimishdalal.me > > Our generation has had no Great Depression, no Great War. Our war is a > spiritual war. Our great depression is our lives. 
> _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From nimish.s.dalal at gmail.com Fri Apr 24 10:08:50 2015 From: nimish.s.dalal at gmail.com (Nimish Dalal) Date: Fri, 24 Apr 2015 13:38:50 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: On Fri, Apr 24, 2015 at 1:26 PM, Dhawal Joharapurkar wrote: > import newspaper > outfile = open('outfile.txt', 'a+) # Create a file handle > > Sc_paper = newspaper.build(u'http://scroll.in/') > for article in Sc_paper.articles: > print(article.url) > outfile.write(article.url) # Write to file > outfile.write('\n') > > outfile.close() # Close the file handle > > This should work. > > Wow! That actually worked for me. Thank you very much. I appreciate that. > On Fri, Apr 24, 2015 at 1:23 PM, Nimish Dalal > wrote: > > > On Fri, Apr 24, 2015 at 12:48 PM, sshabinesh at gmail.com < > > sshabinesh at gmail.com > > > wrote: > > > > > -OR- > > > > > > import newspaper > > > > > > articles = [ ] > > > Sc_paper = newspaper.build(u'http://scroll.in/') > > > > > > for article in Sc_paper.articles: > > > articles.append(article) > > > > > > with open("filename", "w") as f: > > > f.write( "\n".join(articles) ) > > > > > > Thanks Shabinesh, > > > > I tried this code but gives me an error > > TypeError: sequence item 0: expected string, Article found > > > > On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma > > > wrote: > > > > > > > By export you mean you want to create a text file for every URL? > > > > Hi, I am new to python and need help with newspaper. > > > > I am using this module as I find it easier to extract the urls from > the > > > > website. > > > > > > > > Here's my code: > > > > > > > > import newspaper > > > > Sc_paper = newspaper.build(u'http://scroll.in/') > > > > for article in Sc_paper.articles: > > > > print(article.url) > > > > http://scroll.in/... > > > > http://scroll.in/... 
> > > > > > > > Instead of print I want the urls to export as a .txt file or .csv > > file. > > > > > > > > Thanks in advance. > > > > > > > > > > > > -- > > > > Nimish S. Dalal > > > > Cell: 9819670499 > > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > > Twitter: http://twitter.com/nimishdalal < > > http://twitter.com/nimishsdalal > > > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > > Url: http://www.nimishdalal.me > > > > > > > > Our generation has had no Great Depression, no Great War. Our war is > a > > > > spiritual war. Our great depression is our lives. > > > > _______________________________________________ > > > > BangPypers mailing list > > > > BangPypers at python.org > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > _______________________________________________ > > > > BangPypers mailing list > > > > BangPypers at python.org > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > _______________________________________________ > > > BangPypers mailing list > > > BangPypers at python.org > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > > > > > -- > > Nimish S. Dalal > > Cell: 9819670499 > > Facebook: http://www.facebook.com/nimish.s.dalal > > Twitter: http://twitter.com/nimishdalal > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > Url: http://www.nimishdalal.me > > > > Our generation has had no Great Depression, no Great War. Our war is a > > spiritual war. Our great depression is our lives. > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > -- Nimish S. 
Dalal Cell: 9819670499 Facebook: http://www.facebook.com/nimish.s.dalal Twitter: http://twitter.com/nimishdalal Linkedin:http://in.linkedin.com/in/nimishsdll Url: http://www.nimishdalal.me Our generation has had no Great Depression, no Great War. Our war is a spiritual war. Our great depression is our lives. From prince09cs at gmail.com Fri Apr 24 10:14:27 2015 From: prince09cs at gmail.com (Prince Sharma) Date: Fri, 24 Apr 2015 13:44:27 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: As mentioned you can create a list of all the URLs. Something like: import newspaper fh = open("filetosave","w") Sc_paper = newspaper.build(u'http://scroll.in/') for article in Sc_paper.articles: print(article.url) fh,write(str(article.url)) http://scroll.in/. .. http://scroll.in/. .. On Apr 24, 2015 1:20 PM, "Nimish Dalal" wrote: > On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma > wrote: > > > By export you mean you want to create a text file for every URL? > > > > Hey Prince, > I want all the urls to be compiled in a text file. > > Hi, I am new to python and need help with newspaper. > > I am using this module as I find it easier to extract the urls from the > > website. > > > > Here's my code: > > > > import newspaper > > Sc_paper = newspaper.build(u'http://scroll.in/') > > for article in Sc_paper.articles: > > print(article.url) > > http://scroll.in/... > > http://scroll.in/... > > > > Instead of print I want the urls to export as a .txt file or .csv file. > > > > Thanks in advance. > > > > > > -- > > Nimish S. Dalal > > Cell: 9819670499 > > Facebook: http://www.facebook.com/nimish.s.dalal > > Twitter: http://twitter.com/nimishdalal > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > Url: http://www.nimishdalal.me > > > > Our generation has had no Great Depression, no Great War. Our war is a > > spiritual war. Our great depression is our lives. 
> > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > -- > Nimish S. Dalal > Cell: 9819670499 > Facebook: http://www.facebook.com/nimish.s.dalal > Twitter: http://twitter.com/nimishdalal > Linkedin:http://in.linkedin.com/in/nimishsdll > Url: http://www.nimishdalal.me > > Our generation has had no Great Depression, no Great War. Our war is a > spiritual war. Our great depression is our lives. > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > From nimish.s.dalal at gmail.com Fri Apr 24 10:35:10 2015 From: nimish.s.dalal at gmail.com (Nimish Dalal) Date: Fri, 24 Apr 2015 14:05:10 +0530 Subject: [BangPypers] Newspaper in Python In-Reply-To: References: Message-ID: On Fri, Apr 24, 2015 at 1:44 PM, Prince Sharma wrote: > As mentioned you can create a list of all the URLs. > > Something like: > > > import newspaper > > fh = open("filetosave","w") > > Sc_paper = newspaper.build(u'http://scroll.in/') > > for article in Sc_paper.articles: > print(article.url) > > fh,write(str(article.url)) > > http://scroll.in/. .. > http://scroll.in/. .. > > Thank you Prince. That was helpful. I appreciate all you people who spared your time to resolve the issue. Someday I think I will be intelligent enuf to share my knowledge and resolve noob's problems. Haha.. > On Apr 24, 2015 1:20 PM, "Nimish Dalal" wrote: > > > On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma > > wrote: > > > > > By export you mean you want to create a text file for every URL? > > > > > > > Hey Prince, > > I want all the urls to be compiled in a text file. > > > > Hi, I am new to python and need help with newspaper. 
> > > I am using this module as I find it easier to extract the urls from the > > > website. > > > > > > Here's my code: > > > > > > import newspaper > > > Sc_paper = newspaper.build(u'http://scroll.in/') > > > for article in Sc_paper.articles: > > > print(article.url) > > > http://scroll.in/... > > > http://scroll.in/... > > > > > > Instead of print I want the urls to export as a .txt file or .csv > file. > > > > > > Thanks in advance. > > > > > > > > > -- > > > Nimish S. Dalal > > > Cell: 9819670499 > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > Twitter: http://twitter.com/nimishdalal < > http://twitter.com/nimishsdalal > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > Url: http://www.nimishdalal.me > > > > > > Our generation has had no Great Depression, no Great War. Our war is a > > > spiritual war. Our great depression is our lives. > > > _______________________________________________ > > > BangPypers mailing list > > > BangPypers at python.org > > > https://mail.python.org/mailman/listinfo/bangpypers > > > _______________________________________________ > > > BangPypers mailing list > > > BangPypers at python.org > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > > > > > -- > > Nimish S. Dalal > > Cell: 9819670499 > > Facebook: http://www.facebook.com/nimish.s.dalal > > Twitter: http://twitter.com/nimishdalal > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > Url: http://www.nimishdalal.me > > > > Our generation has had no Great Depression, no Great War. Our war is a > > spiritual war. Our great depression is our lives. > > _______________________________________________ > > BangPypers mailing list > > BangPypers at python.org > > https://mail.python.org/mailman/listinfo/bangpypers > > > _______________________________________________ > BangPypers mailing list > BangPypers at python.org > https://mail.python.org/mailman/listinfo/bangpypers > -- Nimish S. 
Dalal
Cell: 9819670499
Facebook: http://www.facebook.com/nimish.s.dalal
Twitter: http://twitter.com/nimishdalal
Linkedin: http://in.linkedin.com/in/nimishsdll
Url: http://www.nimishdalal.me

Our generation has had no Great Depression, no Great War. Our war is a
spiritual war. Our great depression is our lives.

From prince09cs at gmail.com  Fri Apr 24 10:48:40 2015
From: prince09cs at gmail.com (Prince Sharma)
Date: Fri, 24 Apr 2015 14:18:40 +0530
Subject: [BangPypers] Newspaper in Python
In-Reply-To:
References:
Message-ID:

No problem, however I forgot to close the file handle. You can start
solving the problem from there ;)

Cheers,
Prince

On Fri, Apr 24, 2015 at 2:05 PM, Nimish Dalal wrote:

> On Fri, Apr 24, 2015 at 1:44 PM, Prince Sharma wrote:
>
> > As mentioned you can create a list of all the URLs.
> >
> > Something like:
> >
> > import newspaper
> >
> > fh = open("filetosave", "w")
> >
> > Sc_paper = newspaper.build(u'http://scroll.in/')
> >
> > for article in Sc_paper.articles:
> >     print(article.url)
> >     fh.write(str(article.url))
> >
> > http://scroll.in/...
> > http://scroll.in/...
> >
> Thank you Prince. That was helpful.
>
> I appreciate all you people who spared your time to resolve the issue.
>
> Someday I think I will be intelligent enuf to share my knowledge and
> resolve noob's problems. Haha..
>
> > On Apr 24, 2015 1:20 PM, "Nimish Dalal" wrote:
> > > On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma wrote:
> > > > By export you mean you want to create a text file for every URL?
> > >
> > > Hey Prince,
> > > I want all the urls to be compiled in a text file.
> > >
> > > Hi, I am new to python and need help with newspaper.
> > > > I am using this module as I find it easier to extract the urls
> > > > from the website.
> > > > > > > > Here's my code: > > > > > > > > import newspaper > > > > Sc_paper = newspaper.build(u'http://scroll.in/') > > > > for article in Sc_paper.articles: > > > > print(article.url) > > > > http://scroll.in/... > > > > http://scroll.in/... > > > > > > > > Instead of print I want the urls to export as a .txt file or .csv > > file. > > > > > > > > Thanks in advance. > > > > > > > > > > > > -- > > > > Nimish S. Dalal > > > > Cell: 9819670499 > > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > > Twitter: http://twitter.com/nimishdalal < > > http://twitter.com/nimishsdalal > > > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > > Url: http://www.nimishdalal.me > > > > > > > > Our generation has had no Great Depression, no Great War. Our war is > a > > > > spiritual war. Our great depression is our lives. > > > > _______________________________________________ > > > > BangPypers mailing list > > > > BangPypers at python.org > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > _______________________________________________ > > > > BangPypers mailing list > > > > BangPypers at python.org > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > > > > > > > > > > -- > > > Nimish S. Dalal > > > Cell: 9819670499 > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > Twitter: http://twitter.com/nimishdalal < > http://twitter.com/nimishsdalal > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > Url: http://www.nimishdalal.me > > > > > > Our generation has had no Great Depression, no Great War. Our war is a > > > spiritual war. Our great depression is our lives. 
> > > _______________________________________________
> > > BangPypers mailing list
> > > BangPypers at python.org
> > > https://mail.python.org/mailman/listinfo/bangpypers
> >
> > _______________________________________________
> > BangPypers mailing list
> > BangPypers at python.org
> > https://mail.python.org/mailman/listinfo/bangpypers
>
> --
> Nimish S. Dalal
> Cell: 9819670499
> Facebook: http://www.facebook.com/nimish.s.dalal
> Twitter: http://twitter.com/nimishdalal
> Linkedin: http://in.linkedin.com/in/nimishsdll
> Url: http://www.nimishdalal.me
>
> Our generation has had no Great Depression, no Great War. Our war is a
> spiritual war. Our great depression is our lives.
> _______________________________________________
> BangPypers mailing list
> BangPypers at python.org
> https://mail.python.org/mailman/listinfo/bangpypers

From mandarvaze at gmail.com  Fri Apr 24 11:44:53 2015
From: mandarvaze at gmail.com (Mandar Vaze / मंदार वझे)
Date: Fri, 24 Apr 2015 15:14:53 +0530
Subject: [BangPypers] Newspaper in Python
In-Reply-To:
References:
Message-ID:

If you use the "with" context manager then you don't have to explicitly
close the file handle, e.g.:

with open("filetosave", "w") as fh:
    for article in Sc_paper.articles:
        fh.write(article.url)

-Mandar

On Fri, Apr 24, 2015 at 2:18 PM, Prince Sharma wrote:

> No problem, however I forgot to close the file handle. You can start
> solving the problem from there ;)
>
> Cheers,
> Prince
>
> On Fri, Apr 24, 2015 at 2:05 PM, Nimish Dalal wrote:
>
> > On Fri, Apr 24, 2015 at 1:44 PM, Prince Sharma wrote:
> >
> > > As mentioned you can create a list of all the URLs.
> > >
> > > Something like:
> > >
> > > import newspaper
> > >
> > > fh = open("filetosave", "w")
> > >
> > > Sc_paper = newspaper.build(u'http://scroll.in/')
> > >
> > > for article in Sc_paper.articles:
> > >     print(article.url)
> > >     fh.write(str(article.url))
> > >
> > > http://scroll.in/...
> > > http://scroll.in/. .. > > > > > > > > Thank you Prince. That was helpful. > > > > I appreciate all you people who spared your time to resolve the issue. > > > > Someday I think I will be intelligent enuf to share my knowledge and > > resolve noob's problems. Haha.. > > > > > > > > > On Apr 24, 2015 1:20 PM, "Nimish Dalal" > > wrote: > > > > > > > On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma < > prince09cs at gmail.com> > > > > wrote: > > > > > > > > > By export you mean you want to create a text file for every URL? > > > > > > > > > > > > > Hey Prince, > > > > I want all the urls to be compiled in a text file. > > > > > > > > Hi, I am new to python and need help with newspaper. > > > > > I am using this module as I find it easier to extract the urls from > > the > > > > > website. > > > > > > > > > > Here's my code: > > > > > > > > > > import newspaper > > > > > Sc_paper = newspaper.build(u'http://scroll.in/') > > > > > for article in Sc_paper.articles: > > > > > print(article.url) > > > > > http://scroll.in/... > > > > > http://scroll.in/... > > > > > > > > > > Instead of print I want the urls to export as a .txt file or .csv > > > file. > > > > > > > > > > Thanks in advance. > > > > > > > > > > > > > > > -- > > > > > Nimish S. Dalal > > > > > Cell: 9819670499 > > > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > > > Twitter: http://twitter.com/nimishdalal < > > > http://twitter.com/nimishsdalal > > > > > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > > > Url: http://www.nimishdalal.me > > > > > > > > > > Our generation has had no Great Depression, no Great War. Our war > is > > a > > > > > spiritual war. Our great depression is our lives. 
> > > > > _______________________________________________ > > > > > BangPypers mailing list > > > > > BangPypers at python.org > > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > _______________________________________________ > > > > > BangPypers mailing list > > > > > BangPypers at python.org > > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > > > > > > > > > > > > > > > -- > > > > Nimish S. Dalal > > > > Cell: 9819670499 > > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > > Twitter: http://twitter.com/nimishdalal < > > http://twitter.com/nimishsdalal > > > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > > Url: http://www.nimishdalal.me > > > > > > > > Our generation has had no Great Depression, no Great War. Our war is > a > > > > spiritual war. Our great depression is our lives. > > > > _______________________________________________ > > > > BangPypers mailing list > > > > BangPypers at python.org > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > _______________________________________________ > > > BangPypers mailing list > > > BangPypers at python.org > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > > > > > -- > > Nimish S. Dalal > > Cell: 9819670499 > > Facebook: http://www.facebook.com/nimish.s.dalal > > Twitter: http://twitter.com/nimishdalal > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > Url: http://www.nimishdalal.me > > > > Our generation has had no Great Depression, no Great War. Our war is a > > spiritual war. Our great depression is our lives. 
> > _______________________________________________
> > BangPypers mailing list
> > BangPypers at python.org
> > https://mail.python.org/mailman/listinfo/bangpypers
>
> _______________________________________________
> BangPypers mailing list
> BangPypers at python.org
> https://mail.python.org/mailman/listinfo/bangpypers

From nimish.s.dalal at gmail.com  Fri Apr 24 11:47:39 2015
From: nimish.s.dalal at gmail.com (Nimish Dalal)
Date: Fri, 24 Apr 2015 15:17:39 +0530
Subject: [BangPypers] Newspaper in Python
In-Reply-To:
References:
Message-ID:

Thanks, Mandar.

On Apr 24, 2015 3:15 PM, "Mandar Vaze / मंदार वझे" wrote:

> If you use the "with" context manager then you don't have to explicitly
> close the file handle, e.g.:
>
> with open("filetosave", "w") as fh:
>     for article in Sc_paper.articles:
>         fh.write(article.url)
>
> -Mandar
>
> On Fri, Apr 24, 2015 at 2:18 PM, Prince Sharma wrote:
>
> > No problem, however I forgot to close the file handle. You can start
> > solving the problem from there ;)
> >
> > Cheers,
> > Prince
> >
> > On Fri, Apr 24, 2015 at 2:05 PM, Nimish Dalal wrote:
> >
> > > On Fri, Apr 24, 2015 at 1:44 PM, Prince Sharma wrote:
> > >
> > > > As mentioned you can create a list of all the URLs.
> > > >
> > > > Something like:
> > > >
> > > > import newspaper
> > > >
> > > > fh = open("filetosave", "w")
> > > >
> > > > Sc_paper = newspaper.build(u'http://scroll.in/')
> > > >
> > > > for article in Sc_paper.articles:
> > > >     print(article.url)
> > > >     fh.write(str(article.url))
> > > >
> > > > http://scroll.in/...
> > > > http://scroll.in/...
> > >
> > > Thank you Prince. That was helpful.
> > >
> > > I appreciate all you people who spared your time to resolve the issue.
> > >
> > > Someday I think I will be intelligent enuf to share my knowledge and
> > > resolve noob's problems. Haha..
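[Editor's note] For the record, a runnable sketch of the "with" pattern discussed in this thread. A plain list of URL strings stands in for Sc_paper.articles (hypothetical data, since newspaper.build() needs network access); note that writing the article object itself would raise a TypeError, so the URL string is written instead, one per line.

```python
# Hypothetical stand-in for Sc_paper.articles; real code would use
# newspaper.build(u'http://scroll.in/').articles and article.url.
article_urls = [
    "http://scroll.in/article/1",
    "http://scroll.in/article/2",
]

# The context manager closes the file even if an exception is raised,
# so no explicit fh.close() is needed.
with open("filetosave.txt", "w") as fh:
    for url in article_urls:
        fh.write(url + "\n")   # write() needs a str; add a newline per URL
```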
> > > > > > > > > > > > > On Apr 24, 2015 1:20 PM, "Nimish Dalal" > > > wrote: > > > > > > > > > On Fri, Apr 24, 2015 at 12:29 PM, Prince Sharma < > > prince09cs at gmail.com> > > > > > wrote: > > > > > > > > > > > By export you mean you want to create a text file for every URL? > > > > > > > > > > > > > > > > Hey Prince, > > > > > I want all the urls to be compiled in a text file. > > > > > > > > > > Hi, I am new to python and need help with newspaper. > > > > > > I am using this module as I find it easier to extract the urls > from > > > the > > > > > > website. > > > > > > > > > > > > Here's my code: > > > > > > > > > > > > import newspaper > > > > > > Sc_paper = newspaper.build(u'http://scroll.in/') > > > > > > for article in Sc_paper.articles: > > > > > > print(article.url) > > > > > > http://scroll.in/... > > > > > > http://scroll.in/... > > > > > > > > > > > > Instead of print I want the urls to export as a .txt file or > .csv > > > > file. > > > > > > > > > > > > Thanks in advance. > > > > > > > > > > > > > > > > > > -- > > > > > > Nimish S. Dalal > > > > > > Cell: 9819670499 > > > > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > > > > Twitter: http://twitter.com/nimishdalal < > > > > http://twitter.com/nimishsdalal > > > > > > > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > > > > Url: http://www.nimishdalal.me > > > > > > > > > > > > Our generation has had no Great Depression, no Great War. Our war > > is > > > a > > > > > > spiritual war. Our great depression is our lives. 
> > > > > > _______________________________________________ > > > > > > BangPypers mailing list > > > > > > BangPypers at python.org > > > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > _______________________________________________ > > > > > > BangPypers mailing list > > > > > > BangPypers at python.org > > > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > > > > > > > > > > > > > > > > > > > > -- > > > > > Nimish S. Dalal > > > > > Cell: 9819670499 > > > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > > > Twitter: http://twitter.com/nimishdalal < > > > http://twitter.com/nimishsdalal > > > > > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > > > Url: http://www.nimishdalal.me > > > > > > > > > > Our generation has had no Great Depression, no Great War. Our war > is > > a > > > > > spiritual war. Our great depression is our lives. > > > > > _______________________________________________ > > > > > BangPypers mailing list > > > > > BangPypers at python.org > > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > > > _______________________________________________ > > > > BangPypers mailing list > > > > BangPypers at python.org > > > > https://mail.python.org/mailman/listinfo/bangpypers > > > > > > > > > > > > > > > > -- > > > Nimish S. Dalal > > > Cell: 9819670499 > > > Facebook: http://www.facebook.com/nimish.s.dalal > > > Twitter: http://twitter.com/nimishdalal < > http://twitter.com/nimishsdalal > > > > > > Linkedin:http://in.linkedin.com/in/nimishsdll > > > Url: http://www.nimishdalal.me > > > > > > Our generation has had no Great Depression, no Great War. Our war is a > > > spiritual war. Our great depression is our lives. 
> > > _______________________________________________
> > > BangPypers mailing list
> > > BangPypers at python.org
> > > https://mail.python.org/mailman/listinfo/bangpypers
> >
> > _______________________________________________
> > BangPypers mailing list
> > BangPypers at python.org
> > https://mail.python.org/mailman/listinfo/bangpypers
>
> _______________________________________________
> BangPypers mailing list
> BangPypers at python.org
> https://mail.python.org/mailman/listinfo/bangpypers

From sunil at planmytour.in  Sun Apr 26 15:04:55 2015
From: sunil at planmytour.in (Sunil Gupta)
Date: Sun, 26 Apr 2015 18:34:55 +0530
Subject: [BangPypers] looking for Python Django Developer by a 10000 Startup company
Message-ID:

*PlanMyTour: A 10000 Startup company (powered by Nasscom)* *is looking for a
Python Django Developer.*

*Job Description:*
You would be responsible for working across technologies and platforms with
advanced concepts like Python/Django, strong algorithms, Google APIs, etc.
You would also be part of the team working with SOAP, web services, payment
gateways, MySQL, Android, iOS, telecom and what not. 90% of our work is
dynamic (server side) with lots of real-time business logic and strong
PL/SQL programming.

Experience: 1 to 10 years

*Work Culture:*
Our work culture is "die for work and live for fun", enjoying each moment of
life. Our team loves hearing each other's ideas, quarrelling over
suggestions and appreciating the best ideas. We love cheering with a glass
of beer after smashing targets. And we have a mid-term appraisal cycle as
well.

*Perks:*
Equity-based employment, stock options, health insurance, LTA, task-based
yearly bonus, joining bonus, telephone allowance, accommodation if needed,
recreation club membership (swimming pool, tennis court, gym & indoor
sports)

*Salary:* More than industry standard, equity-based employment, stock
options ...
*About the company:*
*A 10000 Startup company (powered by Nasscom)*

Plan My Tour (a unit of BiRam Technologies Pvt Ltd) is a cloud-based
technology platform that lets travellers customize their tour plans. This is
a first-of-its-kind product in the tourism industry, and you are going to be
part of the team in an innovative company. We are a start-up backed by a
technical team managed by experts with 14 years of working experience in
MNCs like TCS, Honeywell and HCL, in technologies like Java, C++, Python,
VC++, web services and various web development tools. We know you will take
pride in seeing your product being used by millions of customers in two
years.

Send your resume to sunil at planmytour.in

Thanks
Sunil
09008524726

From mahuya at gkhrconsulting.com  Tue Apr 28 13:14:37 2015
From: mahuya at gkhrconsulting.com (Mahuya)
Date: Tue, 28 Apr 2015 16:44:37 +0530
Subject: [BangPypers] Opportunities related to Python! Please acknowledge.
Message-ID:

Dear All!

Writing you this note regarding a Senior Python Developer (8-16 years)
position with one of our clients, an American R&D multinational, for their
Bangalore operation. In this regard, I am talking to highly skilled Python
developers from the industry. Do let me know if you have any references for
me. I can share more details 1:1 upon request.

Thanks & Regards
Mahuya
email: mahuya at gkhrconsulting.com / mahuya.gkc at gmail.com
Website: www.gkhrconsulting.com

---
This email has been checked for viruses by Avast antivirus software.
http://www.avast.com

From arunvr at gmail.com  Tue Apr 28 16:16:22 2015
From: arunvr at gmail.com (Arun Ravindran)
Date: Tue, 28 Apr 2015 19:46:22 +0530
Subject: [BangPypers] [ANN] Book on "Django Design Patterns and Best Practices" and Giveaways
Message-ID:

*** Promotional Post ***

Hi, I am Arun Ravindran. I write articles on Django at http://arunrocks.com.
Recently my book titled "Django Design Patterns and Best Practices" got
published.
It is written for intermediate and advanced Django developers. You can learn
more about the book in my previous blog post. We are running a Twitter
contest to give away four free copies. The first question has been tweeted
here: https://twitter.com/arocks/status/593006841002598401

The book can be purchased from various online outlets. The paperback edition
is imported, so it might be expensive. The ebook version is quite affordable
at Rs. 456.

If you have any suggestions or questions about the book, please feel free to
write to me.

Thanks,
Arun