Poppins Posted May 12, 2015

Here is the link to the page. I'm going to have it up for a little while, maybe a few days, as long as my computer doesn't slow down too much. I need testers for this. This is the first version, and it's not pretty, but hopefully it does what it's supposed to do. So please go to that link, enter some information into the text area, and hit submit. Let's see what ends up happening.

The goal is to collect enough information from every zip code entered into the system so I can send it along to the politicians who are in office or running for office. I think they would like to see what people need. It's one step beyond voting that more people may like to take part in.
sanctus Posted May 13, 2015

I didn't really get what it is supposed to do, so I clicked to check it out and got: "Tunnel aa4899f.ngrok.com not found".
Poppins Posted May 13, 2015 Author

I left work yesterday and forgot to put it up overnight :l. I wish I had remembered. It's back up now... for the time being.
sanctus Posted May 15, 2015

It is up now, cool idea! Since I am not from the US, there is no point in me filling something in. But looking at the top 10, you should add so-called "stop words" to your language-processing algorithm to remove them (even if you say to disregard these). I do not know which algorithm you use, but adding a list of tokens/tuples/n-tuples to ignore should be straightforward, and such lists are already readily available on the net. See the sketch below for the basic idea.
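To make the suggestion concrete, here is a minimal sketch of stop-word filtering in Python. The stop-word set and the sample input are placeholders invented for illustration; any published list could be dropped in instead, and this is not the algorithm the site actually uses.

    # Minimal stop-word filter (illustrative only; the word list is a placeholder).
    STOP_WORDS = {"i", "we", "the", "a", "an", "to", "of", "and", "really", "now"}

    def strip_stop_words(text):
        """Return the input with common filler words removed."""
        tokens = text.lower().split()
        return " ".join(t for t in tokens if t not in STOP_WORDS)

    print(strip_stop_words("I need money now really bad"))  # -> "need money bad"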
Poppins Posted May 15, 2015 Author

I tried one method to remove the fragments, but it ended up doing something like this: "I need money" became a fragment of "I need money now", which became a fragment of "I need money now really bad". I guess I could tokenize the fragments and get rid of some of them that way, but it's still so early in its development that I haven't thought that far ahead. It's basically just a pattern recognizer, but the cool thing is that once it's where I want it to be, it's going to be global. It will recognize any pattern, in any orthography, in any location. I'm going to take IP addresses, check a database for where they are located (by city), and prioritize the input for the city, the state, the country, and the world, depending on the language. My idea of Scientocracy is not just about changing my own country; it's about getting people what they feel they need, no matter where they are located.
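One way to handle the fragment problem described above is to collapse any entry that is a substring of a longer entry, crediting its count to the longer one. This is only a sketch of that idea under my own assumptions, not the method the site uses; the entries and counts are made up.

    # Sketch: fold entries that are substrings of longer entries into them.
    # The counts below are invented for illustration.
    counts = {
        "i need money": 3,
        "i need money now": 2,
        "i need money now really bad": 1,
    }

    def collapse_fragments(counts):
        """Merge each entry into the longest kept entry that contains it."""
        merged = {}
        # Visit the longest entries first, so shorter fragments fold into them.
        for phrase in sorted(counts, key=len, reverse=True):
            for kept in merged:
                if phrase in kept:          # phrase is a fragment of a kept entry
                    merged[kept] += counts[phrase]
                    break
            else:
                merged[phrase] = counts[phrase]
        return merged

    print(collapse_fragments(counts))
    # {'i need money now really bad': 6}

Whether the shorter or the longer phrasing should "win" is a design choice; this sketch keeps the longest, which preserves the most context at the cost of burying the common core phrase.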
Poppins Posted May 15, 2015 Author (edited)

sanctus said: "but adding a list of tokens/tuples/n-tuples to ignore should be straight forward and there are such lists already readily available on the net."

I don't understand what you mean here by tuples and n-tuples. I've already posted the algorithm here before as well. Here it is, essentially, but spiced up so that it's fun to look at if you put it in the Python shell (this is Python 2, since it uses raw_input and the print statement):

    while True:
        poi = raw_input('DNA> ')
        npoi = ''
        while len(poi + npoi) != 0:
            # Print a bar indented by the shorter string's length and
            # widened by the length difference between the two strings.
            if len(poi) > len(npoi):
                print ' ' * len(npoi), '0=' + '=' * (len(poi) - len(npoi)) + '=0'
            else:
                print ' ' * len(poi), '0=' + '=' * (len(npoi) - len(poi)) + '=0'
            # Move the last character of poi onto the front of npoi.
            if len(poi) != 0:
                npoi = poi[-1] + npoi
                if len(poi) == 1:
                    poi = ''
                else:
                    poi = poi[:-1]
            # Once poi is exhausted, restart with npoi minus its first character.
            if len(poi) == 0:
                poi = npoi[1:]
                npoi = ''

Edited May 15, 2015 by Poppins
Poppins Posted May 18, 2015 Author

The program has been updated to handle local, statewide, and national inputs (although I think it only works for the USA right now).
sanctus Posted May 19, 2015

There is a list of stopwords at the bottom of this page:
https://www.elastic.co/guide/en/elasticsearch/guide/current/stopwords.html

But this section, on the pros and cons of removing stopwords, is also worth reading:
https://www.elastic.co/guide/en/elasticsearch/guide/master/pros-cons-stopwords.html

So you could write your own list, for example one that keeps "not". By tuples I mean combinations of words that can also be removed automatically, like "I think" (a 2-tuple) or a 3-tuple like "I think we" in "I think we need more tuples", since only "need more tuples" is important. See the sketch below.
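Here is one way the 2-tuple / 3-tuple idea could look in code. It is only a sketch under my own assumptions: the phrase list is invented for illustration, it only strips a phrase from the start of the input, and it deliberately leaves "not" alone, as suggested above.

    # Sketch: strip a leading stop-phrase (2- or 3-tuple) before counting.
    # The phrase list below is invented for illustration.
    STOP_PHRASES = [
        ("i", "think", "we"),   # check 3-tuples first so the longest match wins
        ("i", "think"),
    ]

    def strip_leading_phrase(text):
        """Remove a leading stop-phrase, if any, from the tokenized input."""
        tokens = text.lower().split()
        for phrase in STOP_PHRASES:
            if tuple(tokens[:len(phrase)]) == phrase:
                return " ".join(tokens[len(phrase):])
        return " ".join(tokens)

    print(strip_leading_phrase("I think we need more tuples"))  # -> "need more tuples"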
Poppins Posted May 19, 2015 Author

If you tried to access the webpage yesterday, it probably took forever to load. I'm using IP addresses to localize the inputs to states and ISP areas right now. I entered an IP from Norway and it took a while to check the database. Once it finally printed out the results, everything said "None". So if you're from Norway, then in the eyes of my program you are from the city of None and the state of None, but I think you can still vote for the USA, which I will have to fix. I'm going to do that now.
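For what it's worth, one way to guard against the None case is to only count an input toward the levels the lookup actually resolved, with no USA default. This is a minimal sketch under my own assumptions: lookup_location is a hypothetical placeholder standing in for the real database query, and the IPs are documentation-range examples, not real addresses.

    # Sketch: fall back gracefully when an IP lookup returns missing fields.
    # lookup_location is hypothetical; substitute the real geolocation source.

    def lookup_location(ip):
        """Placeholder: return (city, state, country), any of which may be None."""
        fake_db = {"203.0.113.7": ("Oslo", None, "Norway")}
        return fake_db.get(ip, (None, None, None))

    def classify_input(ip):
        """Decide which buckets (city/state/country/world) an input counts toward."""
        city, state, country = lookup_location(ip)
        buckets = ["world"]                       # every input counts globally
        if country:
            buckets.append("country:" + country)  # only add levels we actually know
        if state:
            buckets.append("state:" + state)
        if city:
            buckets.append("city:" + city)
        return buckets

    print(classify_input("203.0.113.7"))   # ['world', 'country:Norway', 'city:Oslo']
    print(classify_input("198.51.100.1"))  # ['world']  -- unknown IP, no USA default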