EDITORIAL 
EDITORIAL POLICY 
Relevance. The 2022 Update of the COPE, DOAJ, OASPA, and WAME joint guidelines on The Principles of Transparency and Best Practice in Scholarly Publishing encourages journals to establish their own policy on “data sharing and reproducibility” (DS&R). However, the document neither provides detailed recommendations or templates nor explains the phenomenon of the reproducibility crisis.
Objectives. To analyze and interpret the international guidelines, the best practices of global publishers and journals, and the typical mistakes and experience of selected Russian journals, in order to help a journal develop and implement its own DS&R policy.
Materials and methods. The analysis covered various sources (literature, reporting guidelines, data repositories), the policies of 83 Russian university journals, and the policies of the top 5 international publishers and their journals. Interviews on DS&R were conducted with 6 editors-in-chief of Russian journals.
Results. All of the top 5 global publishers adapt the TOP Guidelines in their DS&R policies and offer their own data sharing statement templates. Discussion and interpretations. The author offers a Russian translation of the TOP Guidelines and of the main templates (e.g., the data sharing statement). He also discusses 9 best journal policies and practices (including study pre-registration).
Conclusions. Numerous international sources, as well as the experience of selected Russian journals, demonstrate that implementing a DS&R policy increases article citations (by 25.3% on average), boosts a journal’s bibliometric and altmetric indicators, and builds trust with the target audience. As a result, it strengthens the journal’s portfolio, enabling it to publish articles well ahead of schedule. However, merely declaring a DS&R policy without proper implementation brings journals no tangible benefits.
ACADEMIC WRITING 
The importance of correct citation for the development and deepening of scholarly knowledge is evident. At the same time, the role of citation in assessing the quality of a particular paper or journal can be interpreted differently depending on citation practices, which therefore require systematic analysis and ongoing tracking. The purpose of this article is to present the architecture of the current citation landscape and to comment on steps for avoiding common pitfalls in the citation process. The authors have analyzed the most significant foreign studies on the topic and highlighted the key aspects of modern citation culture, along with new trends that may change citation practice. A typology of citation is offered, the most prominent theories of citing authors’ behavior are discussed, strategies of citing authors’ behavior and the factors that shape citation patterns are described, the parameters of high-quality citation are analyzed, and recommendations for implementing successful citing strategies are given. The authors have also highlighted new trends in the citation context (citation context analysis, the phenomenon of unwanted citation) that may significantly transform citation practice. An understanding of the landscape of modern citation culture by citing authors can fundamentally optimize citation standards, achieving maximum transparency and validity in scholarly communication.
A convenient tabular format for describing statistical methods and software in scientific publications is proposed. It is recommended to include the name of the software, its version or update date, the names of the procedures used and their purpose, obligatory references, and URLs.
SCIENTOMETRICS AND ALTMETRICS 
The article studies issues related to the compilation of the Russian Journal Whitelist, which is intended to be used in research evaluation. This list has now been approved and posted on the website of the Russian Center for Scientific Information, while the construction of a hierarchy of journals within it is still under discussion. A number of questions about the composition of the whitelist and the principles behind its compilation have been raised in the academic community and require answers. There are also broader questions, in particular, to what extent journal publications are the best way to evaluate research and researchers. I have formulated a number of such questions, inviting readers to reflect and discuss. Despite the difficult situation that has now developed in international scientific communication, it should be seen not only as a crisis but also as an opportunity to create one of the best research assessment systems available today, free from accumulated bias.
The article considers the possible influence of the number of views/downloads of scientific articles from journal websites, as well as the number of their mentions in social networks, on the number of subsequent citations of these publications. In particular, some of these correlations are analyzed using the example of 39 Russian translated biology journals distributed by the Springer Nature publishing house. Data from 2019–2021 were used on the number of article downloads and on the journals’ impact factors, SJR, CiteScore, SNIP, and usage factors. An analysis of the results obtained, together with data available on the Internet, allowed the authors to conclude that the relationship between the number of downloads or altmetrics and the number of citations is not very strong, although it is statistically reliable. It is emphasized that at present a large share of article downloads/views comes from users who are not engaged in science and, accordingly, do not write articles for academic journals: they are simply interested in the results of scientific research, and the Internet is now available to almost everyone. The same applies to the discussion of scientific publications in social networks. Frequent mentions of a work in such networks do appear to stimulate its downloads – but only if the article is open access, because the majority of “law-abiding” social network users have no legal access to publications in subscription journals. According to the authors, these circumstances will lead to a gradual weakening of the correlations considered in the article.
PUBLISHING ETHICS 
India continues to account for a large share not only in publishing predatory journals but also in publishing papers in such journals. The sheer number of academics in India, the continued pressure on them to publish, and the country’s capabilities in information technology are the driving forces. Despite sporadic attempts to find acceptable alternatives to using publication metrics for evaluating scientists and academics, the number of papers published by them in journals, a metric usually refined by introducing some measures of the quality of the journals in which the papers are published, continues to be the most widely used criterion for evaluating research performance. This emphasis favours predatory or deceptive journals because they offer rapid publication and usually have more modest article-processing charges. To prospective authors, such journals often appear indistinguishable from legitimate scholarly journals. This article (1) seeks to bridge that gap in knowledge by suggesting some ways of choosing the right journals and pointing out a number of features of predatory or deceptive journals to help authors to identify and avoid those journals; (2) offers a brief overview of measures taken by the authorities in India to curb predatory journals and the practice of publishing papers in such journals; and (3) suggests some novel ways other than publication metrics of assessing researchers.
As companies advance policies pertaining to social reform, including diversity, equity and inclusion (DEI), the issue of protocol, and how those objectives are being achieved, invites debate. In particular, methods that infringe on authors’ rights or freedoms need to be scrutinized. Online submission systems (OSSs) are typically – and often exclusively – used by authors for submitting their papers. The present paper documents the use of OSSs by 33 journals published by Elsevier to harvest authors’ responses to issues and policies related to DEI. This is achieved via a mandatory survey prior to accessing the OSS. Here, a major concern is the violation of authors’ rights due to the presence of a barrier to entry to the OSS, which prevents them from submitting a paper and thus contravenes a core principle of DEI. Results of an investigation into the transparency of Elsevier’s 33 journals with regard to the same DEI principles that they require of their contributing authors revealed four main findings with regard to the gender diversity of their editorial boards: 1) in only six journals (18%) did 100% of the editors indicate their gender; 2) in 14 journals (42%), the editorial board page of the journal did not carry any statistics related to gender; 3) in five journals (15%), some editors preferred not to disclose their gender (in the case of Discourse, Context & Media, 33% of the responding editors preferred not to disclose their gender); 4) in all journals for which gender statistics were supplied (19, or 58%), none of the responding editors indicated a “non-binary or gender diverse” status. This paper suggests that Elsevier needs to revisit and reform its DEI policies related to editorial boards, as well as to rethink the current mandatory survey for authors using its journals’ OSSs.
ISSN 2541-8122 (Online)