Bugs and Features : capitalisation .not. Capitalisation

RE: capitalisation .not. Capitalisation

by wrapper, posted Oct 29 2014, 6:47


A: Yes, formulas currently capitalize the first word of a sentence. I will look into having an option to change this.


Thanks for that. It is at the word-comparison level that matching needs to be more fuzzy; the "do I capitalise?" decision should be processed as an action on that task.
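
To make that concrete, here is a minimal Python sketch of what I mean (the function names are mine and this is only an illustration, not how the engine does it): compare words case-insensitively, and apply capitalisation only as an output action.

def words_match(a: str, b: str) -> bool:
    # Match at the word-comparison level, ignoring capitalisation.
    return a.casefold() == b.casefold()

def render(word: str, sentence_start: bool) -> str:
    # "Do I capitalise?" becomes an action taken when producing output.
    return word.capitalize() if sentence_start else word.lower()

print(words_match("Optimise", "optimise"))      # True
print(render("optimise", sentence_start=True))  # Optimise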

I have thought a lot about which further ability would most enhance the bots. I would also like to say how impressed I am by the progress you have made, especially in giving the bots the ability to parse web pages, to learn flexibly, and to interact easily with humans. So it is important to be as efficient as possible.

I believe your bots have a good basis, "instinctive action & learning ability", to use as a seed for much more complex interaction and a deeper learning capability, without much further programming. Those skills can be self-learned.

I can see now that you have a great deal of the work done, but it is limited in flexibility and hard-coded.

Humans look at the whole word when comparing words, judging "how alike they are".

An additional, learnable ability would be for the bot to have further parameters it can measure when it can't identify a word. These would give the bot the closest word it has available: the one with the closest amount of "the same letters", "letters in the same position", "number of letters", or another relevant parameter.

From: FTC Discussion
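
Something like this Python sketch is what I have in mind for those closeness parameters; the scoring and the way the parameters are combined are only my assumptions, not anything in the existing code:

from collections import Counter

def closeness(unknown: str, known: str) -> float:
    u, k = unknown.lower(), known.lower()
    same_letters = sum((Counter(u) & Counter(k)).values())    # shared letters
    same_position = sum(a == b for a, b in zip(u, k))         # letters in the same position
    length_gap = abs(len(u) - len(k))                         # difference in number of letters
    return same_letters + same_position - length_gap

def closest_word(unknown: str, vocabulary: list[str]) -> str:
    # The word the bot should fall back to when it cannot identify the input.
    return max(vocabulary, key=lambda w: closeness(unknown, w))

print(closest_word("optamise", ["optimise", "optimal", "organise"]))  # optimise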

The learned ability could then be applied elsewhere, for instance in spelling-error detection: "optomise" is OK, but type "optamise" and it will not find it, even though most of the letters are the same!
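
Python's standard difflib module already does something similar, ranking candidates by overall letter similarity, so a misspelling that an exact-lookup checker would miss still finds its closest match:

import difflib

dictionary = ["optimise", "optimal", "organise", "capitalise"]
print(difflib.get_close_matches("optamise", dictionary, n=1, cutoff=0.6))
# ['optimise']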

Notice that with this first simple skill the bot outperforms any current spell checker, and it can ask for further "functions" to try (test, reduce).

These could be added generically, with the formula (a semi-self-learned, or compiled self-learned, algorithm) associated with the parameter.

The bot would learn a weighting for the parameter. In the first case this would be a single global weighting, but that could later be stored in the talk decision tree.
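
As a rough sketch of that single global weighting, here is one way the weights could be nudged from feedback; the update rule is entirely my assumption, not something the bots do today:

WEIGHTS = {"same_letters": 1.0, "same_position": 1.0, "length_gap": 1.0}

def score(params: dict[str, float]) -> float:
    # Combine the measured parameters using the current global weighting.
    return (WEIGHTS["same_letters"] * params["same_letters"]
            + WEIGHTS["same_position"] * params["same_position"]
            - WEIGHTS["length_gap"] * params["length_gap"])

def learn(params: dict[str, float], was_correct: bool, rate: float = 0.1) -> None:
    # Strengthen or weaken each parameter's weight after feedback.
    sign = 1.0 if was_correct else -1.0
    for name, value in params.items():
        WEIGHTS[name] += sign * rate * value

params = {"same_letters": 7, "same_position": 7, "length_gap": 0}
print(score(params))             # 14.0 with the initial equal weights
learn(params, was_correct=True)
print(WEIGHTS)                   # the rewarded parameters' weights move up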

For instance "a mood" (generically created with same parameter handling template), would be associated with the weightings of the parameter at that point. The mood could be reduced during sleep (data reduction) by data reduction of the number of the parameters needed to give the best response to the current goals (again generically the same)

This will give two major advantages: with just the single word-recognition parameter of the number of letters in the correct position, the amount of training will halve, and it will not matter how you hard-code the typing afterwards (you are the bot's typewriter; it is a limitation of the system they need a tool to overcome, a self-learned tool).


Id: 481872
Posted: Oct 29 2014, 6:47
Replies: 0
Views: 2085, today: 1, week: 8, month: 48