llmtuner.llms.gpt
=================

.. py:module:: llmtuner.llms.gpt


Classes
-------

.. autoapisummary::

   llmtuner.llms.gpt.GPTChatter


Module Contents
---------------

.. py:class:: GPTChatter(filename='')

   .. py:attribute:: name
      :value: 'plaingpt'


   .. py:attribute:: history
      :value: []


   .. py:method:: chat(prompt, usehisto=0, store=True, printstyle=False)

      Get a response from the client.

      * ``usehisto``: number of prior exchanges to prepend to the query as chat history
      * ``store``: store the query and the result in the history
      * ``printstyle``: if True, return the answer as line-broken Markdown


   .. py:method:: reset_history()


   .. py:method:: get_last_answer()


   .. py:method:: save_history(outfilename)


   .. py:method:: load_history(infilename)


   .. py:method:: _print_answer(answer)
      :staticmethod:
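
A minimal usage sketch based only on the signatures documented above (it is not part of the generated reference); the prompt text and the history filename are illustrative, and the underlying client is assumed to be configured elsewhere:

.. code-block:: python

   from llmtuner.llms.gpt import GPTChatter

   chatter = GPTChatter()  # name defaults to 'plaingpt', history starts as []

   # usehisto=0: no prior exchanges prepended; store=True: keep the pair in history
   answer = chatter.chat(
       "What is gradient clipping?",  # illustrative prompt
       usehisto=0,
       store=True,
       printstyle=False,
   )
   print(chatter.get_last_answer())

   chatter.save_history("chat_history.json")  # illustrative filename
   chatter.reset_history()
   chatter.load_history("chat_history.json")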