
Misc. bug: Returned characters garbled when enabling the logprobs parameter  #11210

Closed as not planned
@Acobarn

Description


Name and Version

$ ./llama-server --version
version: llama-b4453-bin-win-cuda-cu12.4-x64

Operating systems

Windows

Which llama.cpp modules do you know to be affected?

llama-server

Command line

./llama-server -m ./phi-4-Q4_K_L.gguf -c 16384 --host 127.0.0.1 --port 8080 -ngl 41 --path ./public_legacy/

Problem description & steps to reproduce

When the logprobs parameter is enabled in llama-server, some characters in the returned text appear garbled. The garbling occurs primarily within Kanji (multi-byte UTF-8) characters.
[Screenshots: garbled characters in the returned text]
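A plausible cause of this symptom is that each token's text is converted to a UTF-8 string independently, while the tokenizer is free to split a multi-byte character across two tokens. Below is a minimal, illustrative Python sketch of that failure mode; the split point is chosen by hand and does not reflect the actual tokenizer's behavior.

```python
# Illustrates how splitting a multi-byte UTF-8 character across two
# pieces produces mojibake when each piece is decoded on its own
# (a likely explanation for the garbled Kanji seen with logprobs).

text = "よろしく"          # each kana here occupies 3 bytes in UTF-8
data = text.encode("utf-8")

# Hypothetical split: byte 4 falls in the middle of "ろ".
piece_a, piece_b = data[:4], data[4:]

# Decoding each piece independently yields U+FFFD replacement characters.
print(piece_a.decode("utf-8", errors="replace"))  # 'よ�'
print(piece_b.decode("utf-8", errors="replace"))  # '��しく'

# Concatenating the raw bytes first decodes cleanly.
print((piece_a + piece_b).decode("utf-8"))        # 'よろしく'
```

This suggests the fix is to accumulate the raw token bytes and decode only complete UTF-8 sequences, rather than decoding per token.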

Furthermore, because the returned data format was changed for OAI compatibility, the probability visualization feature in the Legacy UI has also stopped working since this commit. To reproduce this issue with the Legacy UI, the UI must first be adapted to the OAI-compatible data format.
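One way to adapt the Legacy UI is to translate each OAI-style logprobs entry back into the older per-token shape before rendering. The sketch below assumes the field names on both sides (token / logprob / top_logprobs on the OAI side, tok_str / prob on the legacy side); they may not match your server build exactly.

```python
# Hedged sketch: convert one OAI-style logprobs content entry into a
# legacy-style record with linear probabilities. All field names are
# assumptions about the two formats, not confirmed API.
import math

def oai_to_legacy(entry):
    """Map an OAI-style token entry to a legacy-style probability record."""
    return {
        "content": entry["token"],
        "probs": [
            # OAI reports log-probabilities; the legacy UI expects
            # probabilities in [0, 1], so exponentiate.
            {"tok_str": t["token"], "prob": math.exp(t["logprob"])}
            for t in entry.get("top_logprobs", [])
        ],
    }

# Example input shaped like one streamed OAI logprobs entry.
example = {
    "token": "お",
    "logprob": -0.12,
    "top_logprobs": [
        {"token": "お", "logprob": -0.12},
        {"token": "願", "logprob": -2.30},
    ],
}
legacy = oai_to_legacy(example)
```

Running this mapping over each entry in the streamed response should restore the data shape the Legacy UI's probability view was written against.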

First Bad Commit

Relevant log output

http://llamaserver:port/completion or http://llamaserver:port/v1/completions

{
  "stream": true,
  "n_predict": 400,
  "temperature": 0.7,
  "stop": [],
  "repeat_last_n": 256,
  "repeat_penalty": 1.18,
  "dry_multiplier": 0,
  "dry_base": 1.75,
  "dry_allowed_length": 2,
  "dry_penalty_last_n": -1,
  "top_k": 40,
  "top_p": 0.95,
  "min_p": 0.05,
  "xtc_probability": 0,
  "xtc_threshold": 0.1,
  "typical_p": 1,
  "presence_penalty": 0,
  "frequency_penalty": 0,
  "mirostat": 0,
  "mirostat_tau": 5,
  "mirostat_eta": 0.1,
  "grammar": "",
  "n_probs": 2,
  "min_keep": 2,
  "image_data": [],
  "cache_prompt": true,
  "post_sampling_probs": true,
  "api_key": "",
  "slot_id": -1,
  "prompt": "はじめまして、私の名前はキャサリンです。よろしくお"
}
