Derivation of the Logistic Regression Loss Function - 爱码网
shayue

Logistic Regression

Introduction

Suppose we want to apply machine learning in a hospital setting: for a particular patient with heart disease, estimate the probability that he will be critically ill 3 months from now. Which model should we choose? Could we try the linear regression we have already learned?

Unfortunately, to use linear regression, the data we collect would have to contain each patient's actual probability of being critically ill after 3 months. That is very hard to obtain in practice: for a given patient, all you can observe 3 months later is whether he is critically ill or alive. So linear regression does not fit this scenario.

The logistic function

As noted above, our ultimate target is a probability \(P(y|x)\), where \(y=+1\) denotes the event that the patient is critically ill after 3 months and \(y=-1\) the event that he survives. Clearly \(P(-1|x) = 1 - P(1|x)\).

From linear regression we know how to compute a 'score' for an example as a weighted sum of its features. How do we turn that score into a probability? This is where the logistic function comes in. Its expression is \[ \theta(s)=\frac{1}{1+e^{-s}} \]
Its graph is the familiar S-shaped (sigmoid) curve.

This function has two very useful properties:

  1. \(\theta(-\infty)=0, \ \theta(+\infty)=1\)
  2. \(1-\theta(s)=\theta(-s)\)

In other words, we can pass the weighted score through the logistic function to obtain a probability, and the larger the score, the larger the probability. That is quite reasonable.
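Both properties above are easy to check numerically. A minimal sketch in Python (the function name `theta` simply mirrors the notation here):

```python
import numpy as np

def theta(s):
    """Logistic (sigmoid) function: theta(s) = 1 / (1 + e^(-s))."""
    return 1.0 / (1.0 + np.exp(-s))

# Property 1: the limits at -infinity and +infinity are 0 and 1.
print(theta(-50.0), theta(50.0))                 # ~0.0  ~1.0
# Property 2: 1 - theta(s) equals theta(-s) for any s.
print(np.isclose(1 - theta(2.3), theta(-2.3)))   # True
```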

With that, our model is fully defined; call it the logistic regression model: \[ \begin{equation} h(x) = \frac{1}{1+e^{-w^Tx}} \ \ \ \ \ w, x \text{ are vectors} \end{equation} \]
That is, given a patient's feature vector \(x\), feeding it into the model yields the probability that he will be critically ill 3 months later. But the most important step remains: how is the parameter \(w\) determined? Different values of \(w\) give different models \(h(x)\). Experience tells us we can find clues to the most suitable \(w\) in the data we have already collected.
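As a concrete illustration, equation (1) is just a few lines of Python. The feature values and the weight vector below are made-up numbers, not from any real dataset:

```python
import numpy as np

def h(x, w):
    """Logistic regression model, equation (1): h(x) = 1 / (1 + e^(-w^T x))."""
    return 1.0 / (1.0 + np.exp(-(w @ x)))

# Hypothetical numbers, purely for illustration: a 3-feature patient record
# and some fixed parameter vector w.
x = np.array([1.0, 0.4, -1.2])
w = np.array([0.5, 2.0, 0.3])
print(h(x, w))   # a single probability in (0, 1)
```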

Loss function

In linear regression we defined a squared loss and obtained the final parameters by differentiating it. By analogy, let us define a loss function for logistic regression and try taking its gradient to solve for the parameters. So how should this loss be defined? Least squares again? That clearly does not fit the setting: the data we have only tell us which class each example belongs to.

Let us reason from how the data arise. The dataset we have is \[ D = \{(x_1, 1), (x_2, 1), (x_3, 1), \ldots , (x_n, -1)\} \]
Since the examples are generated independently, the probability of observing \(D\) is \[ \begin{equation} P(x_1, 1) \cdot P(x_2, 1) \cdot P(x_3, 1) \cdots P(x_n, -1) \end{equation}\]
Rewriting (2) with conditional probabilities gives \[ \begin{equation} P(x_1)P(1|x_1) \cdot P(x_2)P(1|x_2) \cdot P(x_3)P(1|x_3) \cdots P(x_n)P(-1|x_n) \end{equation}\]
Next, assume each label follows a Bernoulli (0-1) distribution: \[\begin{equation} P(y|x_i) = \left \{ \begin{array}{ll} f(x_i) & y=+1 \\ 1 - f(x_i) & y=-1 \end{array} \right. \end{equation}\]
So the probability finally takes the form \[\begin{equation} P(x_1)f(x_1) \cdot P(x_2)f(x_2) \cdot P(x_3)f(x_3) \cdots P(x_n)(1-f(x_n)) \end{equation}\]

That is, if the dataset \(D\) was generated by the true model \(f(x)\), the probability of it occurring is (5). But we do not know what the true \(f(x)\) looks like; all we have is the model \(h(x)\) we defined ourselves in (1). So the task becomes: among many candidates \(h_1(x), h_2(x), h_3(x), \ldots, h_m(x)\), find the one closest to the true model \(f(x)\) and take it as our final \(h(x)\).

How, then, do we measure how close \(h(x)\) is to \(f(x)\)? If we substitute \(h(x)\) for \(f(x)\) to generate the same dataset \(D\), we likewise obtain a probability, (6): \[\begin{equation} P(x_1)h(x_1) \cdot P(x_2)h(x_2) \cdot P(x_3)h(x_3) \cdots P(x_n)(1-h(x_n)) \end{equation}\]

The \(h(x)\) that makes the probability in (6) largest is the one we deem most similar to \(f(x)\); this is the idea of maximum likelihood. Moreover, for every candidate \(h_i(x)\) the factor \[\begin{equation} P(x_1) \cdot P(x_2) \cdot P(x_3) \cdots P(x_n) \end{equation}\]
is the same, so the \(h(x)\) closest to \(f(x)\) is simply the one that maximizes (8):
\[\begin{equation} h(x_1) \cdot h(x_2) \cdot h(x_3) \cdots (1-h(x_n)) \end{equation}\]
By the second property of the logistic function, \(1-h(x_n)=h(-x_n)\), so (8) can be rewritten as:
\[\begin{equation} h(x_1) \cdot h(x_2) \cdot h(x_3) \cdots h(-x_n) \end{equation}\]
Using the labels \(y_i \in \{+1,-1\}\) to fold both cases of (9) into a single expression, the ultimate goal is to solve the following optimization problem: \[\begin{equation} \mathop{max}\limits_{w} \ \ \prod_{i=1}^{n}h(y_ix_i) \end{equation}\]
Transforming once more: maximizing an expression is equivalent to minimizing its negative: \[\begin{equation} \mathop{min}\limits_{w} \ \ -\prod_{i=1}^{n}h(y_ix_i) \end{equation}\]
Next we take the logarithm of (11): the logarithm is monotone, and it also turns the product into a sum, so after taking logs: \[\begin{equation} \mathop{min}\limits_{w} \ \ -\sum_{i=1}^{n}\ln{h(y_ix_i)} \end{equation}\]
\(h(x)\)展开,能得到\[\begin{equation} \mathop{min}\limits_{w} \ \ -\sum_{i=1}^{n}\ln{\frac{1}{1+e^{-y_iw^Tx_i}}} \ \ \ \ \ \ \ \ \ \ w与x_i都是向量,x_i表示第i笔数据 \end{equation}\]
再一次\[\begin{equation} \mathop{min}\limits_{w} \ \ \sum_{i=1}^{n}\ln{(1+e^{-y_iw^Tx_i})} \ \ \ \ \ \ \ \ \ \ w与x_i都是向量,x_i表示第i笔数据 \end{equation}\]
大功告成,我们得到了逻辑回归的损失函数,它长成(15)式这个样子\[\begin{equation} J(w)= \sum_{i=1}^{n}\ln{(1+e^{-y_iw^Tx_i})} \ \ \ \ \ \ \ \ \ \ w与x_i都是向量,x_i表示第i笔数据 \end{equation}\]
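Equation (15) can be evaluated directly. A minimal NumPy sketch (the toy dataset below is made up for illustration; `logaddexp` is used so the exponential cannot overflow for large margins):

```python
import numpy as np

def J(w, X, y):
    """Loss (15): sum_i ln(1 + e^(-y_i w^T x_i)).

    np.logaddexp(0, -m) computes ln(1 + e^(-m)) stably for large |m|.
    """
    margins = y * (X @ w)                 # one margin y_i * w^T x_i per example
    return np.sum(np.logaddexp(0.0, -margins))

# Made-up toy data: rows of X are examples, labels y are +1 / -1.
X = np.array([[1.0, 2.0], [1.0, -1.0], [1.0, 0.5]])
y = np.array([1.0, -1.0, 1.0])
print(J(np.zeros(2), X, y))   # at w = 0 every term is ln 2, so n * ln 2
```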
Our goal is to find the \(w\) that minimizes \(J(w)\). As in linear regression, we differentiate, here using the chain rule: \[\begin{equation} \frac{\partial J(w)}{\partial w_j} = \sum_{i=1}^{n}\frac{\partial \ln{(1+e^{-y_iw^Tx_i})}}{\partial (-y_iw^Tx_i)} \cdot \frac{\partial (-y_iw^Tx_i)}{\partial w_j} \end{equation}\]
Simplifying yields \[\begin{equation} \frac{\partial J(w)}{\partial w_j} = \sum_{i=1}^{n}\frac{e^{-y_iw^Tx_i}}{1+e^{-y_iw^Tx_i}} \cdot (-y_ix_{i,j}) \ \ \ \ x_{i,j} \text{ is a scalar: the } j\text{-th component of the } i\text{-th example} \end{equation}\]
so the gradient with respect to the whole vector \(w\) is \[ \begin{equation} \frac{\partial J(w)}{\partial w} = \sum_{i=1}^{n}\frac{e^{-y_iw^Tx_i}}{1+e^{-y_iw^Tx_i}} \cdot (-y_ix_i) \ \ \ \ \text{(stack the per-component results into a vector)} \end{equation}\]
Noting that \(\frac{e^{-y_iw^Tx_i}}{1+e^{-y_iw^Tx_i}} = \theta(-y_iw^Tx_i)\) is exactly the logistic function applied to \(-y_iw^Tx_i\), the gradient can finally be written as \[ \begin{equation} \frac{\partial J(w)}{\partial w} = \sum_{i=1}^{n}\theta(-y_iw^Tx_i)(-y_ix_i) \end{equation}\]
Unfortunately, setting (19) equal to zero does not yield a closed-form solution for \(w\). We therefore need another method to solve this problem.

Gradient descent

There is abundant material on this topic. The idea: at any point on the function's surface there is a directional derivative along every direction; if the point always steps opposite to the direction of steepest ascent, that is, along the negative gradient, it keeps descending toward the minimum. The length of each step is scaled by a learning-rate coefficient \(lr\); updating \(w\) this way, after a number of iterations \(w\) approaches the optimum: \[ \begin{equation} w_{t+1} := w_{t} - lr \cdot \sum_{i=1}^{n}\frac{e^{-y_iw^Tx_i}}{1+e^{-y_iw^Tx_i}} \cdot (-y_ix_i) \ \ \ \ \ lr \text{ is a coefficient greater than } 0 \end{equation}\]
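The whole derivation comes together in a short training loop. A sketch of update (20) on a made-up, linearly separable toy dataset (the helper names `theta` and `fit`, the learning rate, and the data are all illustrative choices, not from the original):

```python
import numpy as np

def theta(s):
    return 1.0 / (1.0 + np.exp(-s))

def fit(X, y, lr=0.1, steps=500):
    """Batch gradient descent with update (20): w := w - lr * grad J(w)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = (theta(-y * (X @ w)) * -y) @ X   # gradient (19)
        w = w - lr * grad
    return w

# Made-up, linearly separable toy data: the label is the sign of feature 2;
# feature 1 is a constant bias term.
X = np.array([[1.0, 2.0], [1.0, 1.0], [1.0, -1.5], [1.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = fit(X, y)
print(np.sign(X @ w))   # signs match y: all four examples classified correctly
```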
